18:46:07.334 INFO MiniDFSCluster - starting cluster: numNameNodes=1, numDataNodes=1
18:46:07.537 INFO NameNode - Formatting using clusterid: testClusterID
18:46:07.547 INFO FSEditLog - Edit logging is async:true
18:46:07.563 INFO FSNamesystem - KeyProvider: null
18:46:07.565 INFO FSNamesystem - fsLock is fair: true
18:46:07.565 INFO FSNamesystem - Detailed lock hold time metrics enabled: false
18:46:07.565 INFO FSNamesystem - fsOwner = runner (auth:SIMPLE)
18:46:07.565 INFO FSNamesystem - supergroup = supergroup
18:46:07.565 INFO FSNamesystem - isPermissionEnabled = true
18:46:07.565 INFO FSNamesystem - isStoragePolicyEnabled = true
18:46:07.565 INFO FSNamesystem - HA Enabled: false
18:46:07.597 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
18:46:07.601 INFO deprecation - hadoop.configured.node.mapping is deprecated. Instead, use net.topology.configured.node.mapping
18:46:07.601 INFO DatanodeManager - dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
18:46:07.601 INFO DatanodeManager - dfs.namenode.datanode.registration.ip-hostname-check=true
18:46:07.603 INFO BlockManager - dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
18:46:07.603 INFO BlockManager - The block deletion will start around 2025 May 19 18:46:07
18:46:07.604 INFO GSet - Computing capacity for map BlocksMap
18:46:07.604 INFO GSet - VM type = 64-bit
18:46:07.604 INFO GSet - 2.0% max memory 3.4 GB = 70 MB
18:46:07.604 INFO GSet - capacity = 2^23 = 8388608 entries
18:46:07.612 INFO BlockManager - Storage policy satisfier is disabled
18:46:07.612 INFO BlockManager - dfs.block.access.token.enable = false
18:46:07.616 INFO BlockManagerSafeMode - dfs.namenode.safemode.threshold-pct = 0.999
18:46:07.616 INFO BlockManagerSafeMode - dfs.namenode.safemode.min.datanodes = 0
18:46:07.616 INFO BlockManagerSafeMode - dfs.namenode.safemode.extension = 0
18:46:07.616 INFO BlockManager - defaultReplication = 1
18:46:07.616 INFO BlockManager - maxReplication = 512
18:46:07.616 INFO BlockManager - minReplication = 1
18:46:07.616 INFO BlockManager - maxReplicationStreams = 2
18:46:07.616 INFO BlockManager - redundancyRecheckInterval = 3000ms
18:46:07.616 INFO BlockManager - encryptDataTransfer = false
18:46:07.616 INFO BlockManager - maxNumBlocksToLog = 1000
18:46:07.634 INFO FSDirectory - GLOBAL serial map: bits=29 maxEntries=536870911
18:46:07.634 INFO FSDirectory - USER serial map: bits=24 maxEntries=16777215
18:46:07.634 INFO FSDirectory - GROUP serial map: bits=24 maxEntries=16777215
18:46:07.634 INFO FSDirectory - XATTR serial map: bits=24 maxEntries=16777215
18:46:07.640 INFO GSet - Computing capacity for map INodeMap
18:46:07.640 INFO GSet - VM type = 64-bit
18:46:07.640 INFO GSet - 1.0% max memory 3.4 GB = 35 MB
18:46:07.640 INFO GSet - capacity = 2^22 = 4194304 entries
18:46:07.644 INFO FSDirectory - ACLs enabled? true
18:46:07.644 INFO FSDirectory - POSIX ACL inheritance enabled? true
18:46:07.644 INFO FSDirectory - XAttrs enabled? true
18:46:07.644 INFO NameNode - Caching file names occurring more than 10 times
18:46:07.648 INFO SnapshotManager - Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
18:46:07.649 INFO SnapshotManager - SkipList is disabled
18:46:07.652 INFO GSet - Computing capacity for map cachedBlocks
18:46:07.653 INFO GSet - VM type = 64-bit
18:46:07.653 INFO GSet - 0.25% max memory 3.4 GB = 8.8 MB
18:46:07.653 INFO GSet - capacity = 2^20 = 1048576 entries
18:46:07.658 INFO TopMetrics - NNTop conf: dfs.namenode.top.window.num.buckets = 10
18:46:07.658 INFO TopMetrics - NNTop conf: dfs.namenode.top.num.users = 10
18:46:07.658 INFO TopMetrics - NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
18:46:07.660 INFO FSNamesystem - Retry cache on namenode is enabled
18:46:07.660 INFO FSNamesystem - Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
18:46:07.661 INFO GSet - Computing capacity for map NameNodeRetryCache
18:46:07.661 INFO GSet - VM type = 64-bit
18:46:07.661 INFO GSet - 0.029999999329447746% max memory 3.4 GB = 1.0 MB
18:46:07.661 INFO GSet - capacity = 2^17 = 131072 entries
18:46:07.674 INFO FSImage - Allocated new BlockPoolId: BP-1968466779-10.1.0.176-1747680367669
18:46:07.679 INFO Storage - Storage directory /tmp/minicluster_storage13238592372457082651/name-0-1 has been successfully formatted.
18:46:07.680 INFO Storage - Storage directory /tmp/minicluster_storage13238592372457082651/name-0-2 has been successfully formatted.
18:46:07.694 INFO FSImageFormatProtobuf - Saving image file /tmp/minicluster_storage13238592372457082651/name-0-1/current/fsimage.ckpt_0000000000000000000 using no compression
18:46:07.694 INFO FSImageFormatProtobuf - Saving image file /tmp/minicluster_storage13238592372457082651/name-0-2/current/fsimage.ckpt_0000000000000000000 using no compression
18:46:07.926 INFO FSImageFormatProtobuf - Image file /tmp/minicluster_storage13238592372457082651/name-0-2/current/fsimage.ckpt_0000000000000000000 of size 401 bytes saved in 0 seconds .
18:46:07.926 INFO FSImageFormatProtobuf - Image file /tmp/minicluster_storage13238592372457082651/name-0-1/current/fsimage.ckpt_0000000000000000000 of size 401 bytes saved in 0 seconds .
18:46:07.938 INFO NNStorageRetentionManager - Going to retain 1 images with txid >= 0
18:46:08.009 INFO FSNamesystem - Stopping services started for active state
18:46:08.009 INFO FSNamesystem - Stopping services started for standby state
18:46:08.010 INFO NameNode - createNameNode []
18:46:08.047 WARN MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
18:46:08.055 INFO MetricsSystemImpl - Scheduled Metric snapshot period at 10 second(s).
18:46:08.055 INFO MetricsSystemImpl - NameNode metrics system started
18:46:08.059 INFO NameNodeUtils - fs.defaultFS is hdfs://127.0.0.1:0
18:46:08.083 INFO JvmPauseMonitor - Starting JVM pause monitor
18:46:08.094 INFO DFSUtil - Filter initializers set : org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
18:46:08.098 INFO DFSUtil - Starting Web-server for hdfs at: http://localhost:0
18:46:08.108 INFO log - Logging initialized @26608ms to org.eclipse.jetty.util.log.Slf4jLog
18:46:08.177 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
18:46:08.181 WARN HttpRequestLog - Jetty request log can only be enabled using Log4j
18:46:08.185 INFO HttpServer2 - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
18:46:08.187 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
18:46:08.187 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
18:46:08.189 INFO HttpServer2 - Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context hdfs
18:46:08.189 INFO HttpServer2 - Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context static
18:46:08.221 INFO HttpServer2 - addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
18:46:08.225 INFO HttpServer2 - Jetty bound to port 35883
18:46:08.226 INFO Server - jetty-9.4.56.v20240826; built: 2024-08-26T17:15:05.868Z; git: ec6782ff5ead824dabdcf47fa98f90a4aedff401; jvm 17.0.6+10
18:46:08.247 INFO session - DefaultSessionIdManager workerName=node0
18:46:08.247 INFO session - No SessionScavenger set, using defaults
18:46:08.248 INFO session - node0 Scavenging every 600000ms
18:46:08.259 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
18:46:08.261 INFO ContextHandler - Started o.e.j.s.ServletContextHandler@1a8b2ab3{static,/static,jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/static,AVAILABLE}
18:46:08.386 INFO ContextHandler - Started o.e.j.w.WebAppContext@3f31ff7a{hdfs,/,file:///tmp/jetty-localhost-35883-hadoop-hdfs-3_3_6-tests_jar-_-any-2156833862804811847/webapp/,AVAILABLE}{jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/hdfs}
18:46:08.391 INFO AbstractConnector - Started ServerConnector@236a641a{HTTP/1.1, (http/1.1)}{localhost:35883}
18:46:08.391 INFO Server - Started @26892ms
18:46:08.395 INFO FSEditLog - Edit logging is async:true
18:46:08.404 INFO FSNamesystem - KeyProvider: null
18:46:08.404 INFO FSNamesystem - fsLock is fair: true
18:46:08.404 INFO FSNamesystem - Detailed lock hold time metrics enabled: false
18:46:08.404 INFO FSNamesystem - fsOwner = runner (auth:SIMPLE)
18:46:08.404 INFO FSNamesystem - supergroup = supergroup
18:46:08.404 INFO FSNamesystem - isPermissionEnabled = true
18:46:08.404 INFO FSNamesystem - isStoragePolicyEnabled = true
18:46:08.404 INFO FSNamesystem - HA Enabled: false
18:46:08.404 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
18:46:08.404 INFO DatanodeManager - dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
18:46:08.404 INFO DatanodeManager - dfs.namenode.datanode.registration.ip-hostname-check=true
18:46:08.405 INFO BlockManager - dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
18:46:08.405 INFO BlockManager - The block deletion will start around 2025 May 19 18:46:08
18:46:08.405 INFO GSet - Computing capacity for map BlocksMap
18:46:08.405 INFO GSet - VM type = 64-bit
18:46:08.405 INFO GSet - 2.0% max memory 3.4 GB = 70 MB
18:46:08.405 INFO GSet - capacity = 2^23 = 8388608 entries
18:46:08.407 INFO BlockManager - Storage policy satisfier is disabled
18:46:08.407 INFO BlockManager - dfs.block.access.token.enable = false
18:46:08.407 INFO BlockManagerSafeMode - dfs.namenode.safemode.threshold-pct = 0.999
18:46:08.407 INFO BlockManagerSafeMode - dfs.namenode.safemode.min.datanodes = 0
18:46:08.407 INFO BlockManagerSafeMode - dfs.namenode.safemode.extension = 0
18:46:08.407 INFO BlockManager - defaultReplication = 1
18:46:08.407 INFO BlockManager - maxReplication = 512
18:46:08.407 INFO BlockManager - minReplication = 1
18:46:08.407 INFO BlockManager - maxReplicationStreams = 2
18:46:08.407 INFO BlockManager - redundancyRecheckInterval = 3000ms
18:46:08.407 INFO BlockManager - encryptDataTransfer = false
18:46:08.407 INFO BlockManager - maxNumBlocksToLog = 1000
18:46:08.407 INFO GSet - Computing capacity for map INodeMap
18:46:08.407 INFO GSet - VM type = 64-bit
18:46:08.407 INFO GSet - 1.0% max memory 3.4 GB = 35 MB
18:46:08.407 INFO GSet - capacity = 2^22 = 4194304 entries
18:46:08.408 INFO FSDirectory - ACLs enabled? true
18:46:08.408 INFO FSDirectory - POSIX ACL inheritance enabled? true
18:46:08.408 INFO FSDirectory - XAttrs enabled? true
18:46:08.408 INFO NameNode - Caching file names occurring more than 10 times
18:46:08.408 INFO SnapshotManager - Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
18:46:08.409 INFO SnapshotManager - SkipList is disabled
18:46:08.409 INFO GSet - Computing capacity for map cachedBlocks
18:46:08.409 INFO GSet - VM type = 64-bit
18:46:08.409 INFO GSet - 0.25% max memory 3.4 GB = 8.8 MB
18:46:08.409 INFO GSet - capacity = 2^20 = 1048576 entries
18:46:08.409 INFO TopMetrics - NNTop conf: dfs.namenode.top.window.num.buckets = 10
18:46:08.409 INFO TopMetrics - NNTop conf: dfs.namenode.top.num.users = 10
18:46:08.409 INFO TopMetrics - NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
18:46:08.409 INFO FSNamesystem - Retry cache on namenode is enabled
18:46:08.409 INFO FSNamesystem - Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
18:46:08.409 INFO GSet - Computing capacity for map NameNodeRetryCache
18:46:08.409 INFO GSet - VM type = 64-bit
18:46:08.409 INFO GSet - 0.029999999329447746% max memory 3.4 GB = 1.0 MB
18:46:08.409 INFO GSet - capacity = 2^17 = 131072 entries
18:46:08.412 INFO Storage - Lock on /tmp/minicluster_storage13238592372457082651/name-0-1/in_use.lock acquired by nodename 3117@pkrvmf6wy0o8zjz
18:46:08.414 INFO Storage - Lock on /tmp/minicluster_storage13238592372457082651/name-0-2/in_use.lock acquired by nodename 3117@pkrvmf6wy0o8zjz
18:46:08.415 INFO FileJournalManager - Recovering unfinalized segments in /tmp/minicluster_storage13238592372457082651/name-0-1/current
18:46:08.415 INFO FileJournalManager - Recovering unfinalized segments in /tmp/minicluster_storage13238592372457082651/name-0-2/current
18:46:08.415 INFO FSImage - No edit log streams selected.
18:46:08.415 INFO FSImage - Planning to load image: FSImageFile(file=/tmp/minicluster_storage13238592372457082651/name-0-1/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
18:46:08.430 INFO FSImageFormatPBINode - Loading 1 INodes.
18:46:08.431 INFO FSImageFormatPBINode - Successfully loaded 1 inodes
18:46:08.433 INFO FSImageFormatPBINode - Completed update blocks map and name cache, total waiting duration 0ms.
18:46:08.434 INFO FSImageFormatProtobuf - Loaded FSImage in 0 seconds.
18:46:08.434 INFO FSImage - Loaded image for txid 0 from /tmp/minicluster_storage13238592372457082651/name-0-1/current/fsimage_0000000000000000000
18:46:08.436 INFO FSNamesystem - Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
18:46:08.437 INFO FSEditLog - Starting log segment at 1
18:46:08.445 INFO NameCache - initialized with 0 entries 0 lookups
18:46:08.445 INFO FSNamesystem - Finished loading FSImage in 35 msecs
18:46:08.508 INFO NameNode - RPC server is binding to localhost:0
18:46:08.508 INFO NameNode - Enable NameNode state context:false
18:46:08.511 INFO CallQueueManager - Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
18:46:08.518 INFO Server - Listener at localhost:36797
18:46:08.519 INFO Server - Starting Socket Reader #1 for port 0
18:46:08.540 INFO NameNode - Clients are to use localhost:36797 to access this namenode/service.
18:46:08.542 INFO FSNamesystem - Registered FSNamesystemState, ReplicatedBlocksState and ECBlockGroupsState MBeans.
18:46:08.557 INFO LeaseManager - Number of blocks under construction: 0
18:46:08.562 INFO DatanodeAdminDefaultMonitor - Initialized the Default Decommission and Maintenance monitor
18:46:08.563 INFO BlockManager - Start MarkedDeleteBlockScrubber thread
18:46:08.564 INFO BlockManager - initializing replication queues
18:46:08.564 INFO StateChange - STATE* Leaving safe mode after 0 secs
18:46:08.564 INFO StateChange - STATE* Network topology has 0 racks and 0 datanodes
18:46:08.564 INFO StateChange - STATE* UnderReplicatedBlocks has 0 blocks
18:46:08.568 INFO BlockManager - Total number of blocks = 0
18:46:08.568 INFO BlockManager - Number of invalid blocks = 0
18:46:08.568 INFO BlockManager - Number of under-replicated blocks = 0
18:46:08.568 INFO BlockManager - Number of over-replicated blocks = 0
18:46:08.568 INFO BlockManager - Number of blocks being written = 0
18:46:08.568 INFO StateChange - STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 4 msec
18:46:08.579 INFO Server - IPC Server Responder: starting
18:46:08.579 INFO Server - IPC Server listener on 0: starting
18:46:08.581 INFO NameNode - NameNode RPC up at: localhost/127.0.0.1:36797
18:46:08.581 WARN MetricsLoggerTask - Metrics logging will not be async since the logger is not log4j
18:46:08.582 INFO FSNamesystem - Starting services required for active state
18:46:08.582 INFO FSDirectory - Initializing quota with 12 thread(s)
18:46:08.584 INFO FSDirectory - Quota initialization completed in 3 milliseconds
name space=1
storage space=0
storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0, PROVIDED=0
18:46:08.587 INFO CacheReplicationMonitor - Starting CacheReplicationMonitor with interval 30000 milliseconds
18:46:08.594 INFO MiniDFSCluster - Starting DataNode 0 with dfs.datanode.data.dir: [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data1,[DISK]file:/tmp/minicluster_storage13238592372457082651/data/data2
18:46:08.607 INFO ThrottledAsyncChecker - Scheduling a check for [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data1
18:46:08.615 INFO ThrottledAsyncChecker - Scheduling a check for [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data2
18:46:08.626 INFO MetricsSystemImpl - DataNode metrics system started (again)
18:46:08.631 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
18:46:08.633 INFO BlockScanner - Initialized block scanner with targetBytesPerSec 1048576
18:46:08.635 INFO DataNode - Configured hostname is 127.0.0.1
18:46:08.636 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
18:46:08.637 INFO DataNode - Starting DataNode with maxLockedMemory = 0
18:46:08.640 INFO DataNode - Opened streaming server at /127.0.0.1:38019
18:46:08.641 INFO DataNode - Balancing bandwidth is 104857600 bytes/s
18:46:08.641 INFO DataNode - Number threads for balancing is 100
18:46:08.645 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
18:46:08.645 WARN HttpRequestLog - Jetty request log can only be enabled using Log4j
18:46:08.646 INFO HttpServer2 - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
18:46:08.647 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
18:46:08.647 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
18:46:08.648 INFO HttpServer2 - Jetty bound to port 40963
18:46:08.648 INFO Server - jetty-9.4.56.v20240826; built: 2024-08-26T17:15:05.868Z; git: ec6782ff5ead824dabdcf47fa98f90a4aedff401; jvm 17.0.6+10
18:46:08.649 INFO session - DefaultSessionIdManager workerName=node0
18:46:08.649 INFO session - No SessionScavenger set, using defaults
18:46:08.649 INFO session - node0 Scavenging every 600000ms
18:46:08.649 INFO ContextHandler - Started o.e.j.s.ServletContextHandler@5a12819f{static,/static,jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/static,AVAILABLE}
18:46:08.739 INFO ContextHandler - Started o.e.j.w.WebAppContext@23c424c6{datanode,/,file:///tmp/jetty-localhost-40963-hadoop-hdfs-3_3_6-tests_jar-_-any-15079480189751032782/webapp/,AVAILABLE}{jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/datanode}
18:46:08.740 INFO AbstractConnector - Started ServerConnector@7194dae5{HTTP/1.1, (http/1.1)}{localhost:40963}
18:46:08.740 INFO Server - Started @27241ms
18:46:08.744 WARN DatanodeHttpServer - Got null for restCsrfPreventionFilter - will not do any filtering.
18:46:08.745 INFO DatanodeHttpServer - Listening HTTP traffic on /127.0.0.1:34801
18:46:08.745 INFO JvmPauseMonitor - Starting JVM pause monitor
18:46:08.746 INFO DataNode - dnUserName = runner
18:46:08.746 INFO DataNode - supergroup = supergroup
18:46:08.753 INFO CallQueueManager - Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
18:46:08.754 INFO Server - Listener at localhost:32847
18:46:08.754 INFO Server - Starting Socket Reader #1 for port 0
18:46:08.757 INFO DataNode - Opened IPC server at /127.0.0.1:32847
18:46:08.771 INFO DataNode - Refresh request received for nameservices: null
18:46:08.772 INFO DataNode - Starting BPOfferServices for nameservices: <default>
18:46:08.778 INFO DataNode - Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:36797 starting to offer service
18:46:08.779 WARN MetricsLoggerTask - Metrics logging will not be async since the logger is not log4j
18:46:08.780 INFO Server - IPC Server Responder: starting
18:46:08.780 INFO Server - IPC Server listener on 0: starting
18:46:08.878 INFO DataNode - Acknowledging ACTIVE Namenode during handshake Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:36797
18:46:08.880 INFO Storage - Using 2 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=2, dataDirs=2)
18:46:08.881 INFO Storage - Lock on /tmp/minicluster_storage13238592372457082651/data/data1/in_use.lock acquired by nodename 3117@pkrvmf6wy0o8zjz
18:46:08.882 INFO Storage - Storage directory with location [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data1 is not formatted for namespace 299576541. Formatting...
18:46:08.882 INFO Storage - Generated new storageID DS-8758914d-d38f-402f-8798-7e5675c4afaf for directory /tmp/minicluster_storage13238592372457082651/data/data1
18:46:08.886 INFO Storage - Lock on /tmp/minicluster_storage13238592372457082651/data/data2/in_use.lock acquired by nodename 3117@pkrvmf6wy0o8zjz
18:46:08.886 INFO Storage - Storage directory with location [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data2 is not formatted for namespace 299576541. Formatting...
18:46:08.886 INFO Storage - Generated new storageID DS-62f0ccc1-8eeb-4abb-80bb-4af676bcf23c for directory /tmp/minicluster_storage13238592372457082651/data/data2
18:46:08.886 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
18:46:08.890 INFO MiniDFSCluster - dnInfo.length != numDataNodes
18:46:08.890 INFO MiniDFSCluster - Waiting for cluster to become active
18:46:08.906 INFO Storage - Analyzing storage directories for bpid BP-1968466779-10.1.0.176-1747680367669
18:46:08.906 INFO Storage - Locking is disabled for /tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669
18:46:08.906 INFO Storage - Block pool storage directory for location [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data1 and block pool id BP-1968466779-10.1.0.176-1747680367669 is not formatted. Formatting ...
18:46:08.906 INFO Storage - Formatting block pool BP-1968466779-10.1.0.176-1747680367669 directory /tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current
18:46:08.925 INFO Storage - Analyzing storage directories for bpid BP-1968466779-10.1.0.176-1747680367669
18:46:08.925 INFO Storage - Locking is disabled for /tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669
18:46:08.925 INFO Storage - Block pool storage directory for location [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data2 and block pool id BP-1968466779-10.1.0.176-1747680367669 is not formatted. Formatting ...
18:46:08.925 INFO Storage - Formatting block pool BP-1968466779-10.1.0.176-1747680367669 directory /tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current
18:46:08.927 INFO DataNode - Setting up storage: nsid=299576541;bpid=BP-1968466779-10.1.0.176-1747680367669;lv=-57;nsInfo=lv=-66;cid=testClusterID;nsid=299576541;c=1747680367669;bpid=BP-1968466779-10.1.0.176-1747680367669;dnuuid=null
18:46:08.929 INFO DataNode - Generated and persisted new Datanode UUID f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d
18:46:08.940 INFO FsDatasetImpl - The datanode lock is a read write lock
18:46:08.964 INFO FsDatasetImpl - Added new volume: DS-8758914d-d38f-402f-8798-7e5675c4afaf
18:46:08.964 INFO FsDatasetImpl - Added volume - [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data1, StorageType: DISK
18:46:08.965 INFO FsDatasetImpl - Added new volume: DS-62f0ccc1-8eeb-4abb-80bb-4af676bcf23c
18:46:08.965 INFO FsDatasetImpl - Added volume - [DISK]file:/tmp/minicluster_storage13238592372457082651/data/data2, StorageType: DISK
18:46:08.967 INFO MemoryMappableBlockLoader - Initializing cache loader: MemoryMappableBlockLoader.
18:46:08.970 INFO FsDatasetImpl - Registered FSDatasetState MBean
18:46:08.972 INFO FsDatasetImpl - Adding block pool BP-1968466779-10.1.0.176-1747680367669
18:46:08.973 INFO FsDatasetImpl - Scanning block pool BP-1968466779-10.1.0.176-1747680367669 on volume /tmp/minicluster_storage13238592372457082651/data/data1...
18:46:08.973 INFO FsDatasetImpl - Scanning block pool BP-1968466779-10.1.0.176-1747680367669 on volume /tmp/minicluster_storage13238592372457082651/data/data2...
18:46:08.977 WARN FsDatasetImpl - dfsUsed file missing in /tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current, will proceed with Du for space computation calculation,
18:46:08.977 WARN FsDatasetImpl - dfsUsed file missing in /tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current, will proceed with Du for space computation calculation,
18:46:08.992 INFO FsDatasetImpl - Time taken to scan block pool BP-1968466779-10.1.0.176-1747680367669 on /tmp/minicluster_storage13238592372457082651/data/data2: 20ms
18:46:08.993 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
18:46:08.993 INFO MiniDFSCluster - dnInfo.length != numDataNodes
18:46:08.994 INFO MiniDFSCluster - Waiting for cluster to become active
18:46:08.994 INFO FsDatasetImpl - Time taken to scan block pool BP-1968466779-10.1.0.176-1747680367669 on /tmp/minicluster_storage13238592372457082651/data/data1: 22ms
18:46:08.994 INFO FsDatasetImpl - Total time to scan all replicas for block pool BP-1968466779-10.1.0.176-1747680367669: 22ms
18:46:08.995 INFO FsDatasetImpl - Adding replicas to map for block pool BP-1968466779-10.1.0.176-1747680367669 on volume /tmp/minicluster_storage13238592372457082651/data/data1...
18:46:08.995 INFO BlockPoolSlice - Replica Cache file: /tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/replicas doesn't exist
18:46:08.995 INFO FsDatasetImpl - Adding replicas to map for block pool BP-1968466779-10.1.0.176-1747680367669 on volume /tmp/minicluster_storage13238592372457082651/data/data2...
18:46:08.995 INFO BlockPoolSlice - Replica Cache file: /tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/replicas doesn't exist
18:46:08.995 INFO FsDatasetImpl - Time to add replicas to map for block pool BP-1968466779-10.1.0.176-1747680367669 on volume /tmp/minicluster_storage13238592372457082651/data/data1: 1ms
18:46:08.996 INFO FsDatasetImpl - Time to add replicas to map for block pool BP-1968466779-10.1.0.176-1747680367669 on volume /tmp/minicluster_storage13238592372457082651/data/data2: 1ms
18:46:08.996 INFO FsDatasetImpl - Total time to add all replicas to map for block pool BP-1968466779-10.1.0.176-1747680367669: 1ms
18:46:08.996 INFO ThrottledAsyncChecker - Scheduling a check for /tmp/minicluster_storage13238592372457082651/data/data1
18:46:09.000 INFO DatasetVolumeChecker - Scheduled health check for volume /tmp/minicluster_storage13238592372457082651/data/data1
18:46:09.000 INFO ThrottledAsyncChecker - Scheduling a check for /tmp/minicluster_storage13238592372457082651/data/data2
18:46:09.000 INFO DatasetVolumeChecker - Scheduled health check for volume /tmp/minicluster_storage13238592372457082651/data/data2
18:46:09.002 INFO VolumeScanner - Now scanning bpid BP-1968466779-10.1.0.176-1747680367669 on volume /tmp/minicluster_storage13238592372457082651/data/data2
18:46:09.002 INFO VolumeScanner - Now scanning bpid BP-1968466779-10.1.0.176-1747680367669 on volume /tmp/minicluster_storage13238592372457082651/data/data1
18:46:09.015 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage13238592372457082651/data/data1, DS-8758914d-d38f-402f-8798-7e5675c4afaf): finished scanning block pool BP-1968466779-10.1.0.176-1747680367669
18:46:09.015 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage13238592372457082651/data/data2, DS-62f0ccc1-8eeb-4abb-80bb-4af676bcf23c): finished scanning block pool BP-1968466779-10.1.0.176-1747680367669
18:46:09.017 WARN DirectoryScanner - dfs.datanode.directoryscan.throttle.limit.ms.per.sec set to value above 1000 ms/sec. Assuming default value of -1
18:46:09.017 INFO DirectoryScanner - Periodic Directory Tree Verification scan starting in 17994138ms with interval of 21600000ms and throttle limit of -1ms/s
18:46:09.020 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage13238592372457082651/data/data2, DS-62f0ccc1-8eeb-4abb-80bb-4af676bcf23c): no suitable block pools found to scan. Waiting 1814399982 ms.
18:46:09.020 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage13238592372457082651/data/data1, DS-8758914d-d38f-402f-8798-7e5675c4afaf): no suitable block pools found to scan. Waiting 1814399982 ms.
18:46:09.021 INFO BlockManagerInfo - Removed broadcast_22_piece0 on localhost:45727 in memory (size: 159.0 B, free: 1920.0 MiB)
18:46:09.023 INFO BlockManagerInfo - Removed broadcast_32_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1920.0 MiB)
18:46:09.023 INFO DataNode - Block pool BP-1968466779-10.1.0.176-1747680367669 (Datanode Uuid f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d) service to localhost/127.0.0.1:36797 beginning handshake with NN
18:46:09.027 INFO BlockManager - Removing RDD 47
18:46:09.031 INFO BlockManagerInfo - Removed broadcast_31_piece0 on localhost:45727 in memory (size: 320.0 B, free: 1920.0 MiB)
18:46:09.033 INFO BlockManagerInfo - Removed broadcast_23_piece0 on localhost:45727 in memory (size: 465.0 B, free: 1920.0 MiB)
18:46:09.037 INFO StateChange - BLOCK* registerDatanode: from DatanodeRegistration(127.0.0.1:38019, datanodeUuid=f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, infoPort=34801, infoSecurePort=0, ipcPort=32847, storageInfo=lv=-57;cid=testClusterID;nsid=299576541;c=1747680367669) storage f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d
18:46:09.038 INFO NetworkTopology - Adding a new node: /default-rack/127.0.0.1:38019
18:46:09.038 INFO BlockReportLeaseManager - Registered DN f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d (127.0.0.1:38019).
18:46:09.040 INFO BlockManagerInfo - Removed broadcast_30_piece0 on localhost:45727 in memory (size: 4.7 KiB, free: 1920.0 MiB)
18:46:09.042 INFO BlockManagerInfo - Removed broadcast_29_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1920.0 MiB)
18:46:09.042 INFO DataNode - Block pool BP-1968466779-10.1.0.176-1747680367669 (Datanode Uuid f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d) service to localhost/127.0.0.1:36797 successfully registered with NN
18:46:09.043 INFO DataNode - For namenode localhost/127.0.0.1:36797 using BLOCKREPORT_INTERVAL of 21600000msecs CACHEREPORT_INTERVAL of 10000msecs Initial delay: 0msecs; heartBeatInterval=3000
18:46:09.043 INFO DataNode - Starting IBR Task Handler.
18:46:09.044 INFO BlockManagerInfo - Removed broadcast_28_piece0 on localhost:45727 in memory (size: 320.0 B, free: 1920.0 MiB)
18:46:09.046 INFO BlockManagerInfo - Removed broadcast_27_piece0 on localhost:45727 in memory (size: 5.1 KiB, free: 1920.0 MiB)
18:46:09.048 INFO BlockManagerInfo - Removed broadcast_33_piece0 on localhost:45727 in memory (size: 4.8 KiB, free: 1920.0 MiB)
18:46:09.054 INFO DatanodeDescriptor - Adding new storage ID DS-8758914d-d38f-402f-8798-7e5675c4afaf for DN 127.0.0.1:38019
18:46:09.054 INFO DatanodeDescriptor - Adding new storage ID DS-62f0ccc1-8eeb-4abb-80bb-4af676bcf23c for DN 127.0.0.1:38019
18:46:09.060 INFO DataNode - After receiving heartbeat response, updating state of namenode localhost:36797 to active
18:46:09.071 INFO BlockStateChange - BLOCK* processReport 0x2f5f619caa2005f1 with lease ID 0x6720f2debf3c3b17: Processing first storage report for DS-8758914d-d38f-402f-8798-7e5675c4afaf from datanode DatanodeRegistration(127.0.0.1:38019, datanodeUuid=f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, infoPort=34801, infoSecurePort=0, ipcPort=32847, storageInfo=lv=-57;cid=testClusterID;nsid=299576541;c=1747680367669)
18:46:09.072 INFO BlockStateChange - BLOCK* processReport 0x2f5f619caa2005f1 with lease ID 0x6720f2debf3c3b17: from storage DS-8758914d-d38f-402f-8798-7e5675c4afaf node DatanodeRegistration(127.0.0.1:38019, datanodeUuid=f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, infoPort=34801, infoSecurePort=0, ipcPort=32847, storageInfo=lv=-57;cid=testClusterID;nsid=299576541;c=1747680367669), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
18:46:09.072 INFO BlockStateChange - BLOCK* processReport 0x2f5f619caa2005f1 with lease ID 0x6720f2debf3c3b17: Processing first storage report for DS-62f0ccc1-8eeb-4abb-80bb-4af676bcf23c from datanode DatanodeRegistration(127.0.0.1:38019, datanodeUuid=f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, infoPort=34801, infoSecurePort=0, ipcPort=32847, storageInfo=lv=-57;cid=testClusterID;nsid=299576541;c=1747680367669)
18:46:09.072 INFO BlockStateChange - BLOCK* processReport 0x2f5f619caa2005f1 with lease ID 0x6720f2debf3c3b17: from storage DS-62f0ccc1-8eeb-4abb-80bb-4af676bcf23c node DatanodeRegistration(127.0.0.1:38019, datanodeUuid=f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, infoPort=34801, infoSecurePort=0, ipcPort=32847, storageInfo=lv=-57;cid=testClusterID;nsid=299576541;c=1747680367669), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
18:46:09.082 INFO DataNode - Successfully sent block report 0x2f5f619caa2005f1 with lease ID 0x6720f2debf3c3b17 to namenode: localhost/127.0.0.1:36797, containing 2 storage report(s), of which we sent 2. The reports had 0 total blocks and used 1 RPC(s). This took 2 msecs to generate and 19 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
18:46:09.082 INFO DataNode - Got finalize command for block pool BP-1968466779-10.1.0.176-1747680367669
18:46:09.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
18:46:09.100 INFO MiniDFSCluster - Cluster is active
18:46:09.163 INFO MemoryStore - Block broadcast_34 stored as values in memory (estimated size 297.9 KiB, free 1919.7 MiB)
18:46:09.183 INFO MemoryStore - Block broadcast_34_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.7 MiB)
18:46:09.183 INFO BlockManagerInfo - Added broadcast_34_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1920.0 MiB)
18:46:09.184 INFO SparkContext - Created broadcast 34 from newAPIHadoopFile at PathSplitSource.java:96
18:46:09.231 INFO MemoryStore - Block broadcast_35 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
18:46:09.241 INFO MemoryStore - Block broadcast_35_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
18:46:09.241 INFO BlockManagerInfo - Added broadcast_35_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.9 MiB)
18:46:09.242 INFO SparkContext - Created broadcast 35 from newAPIHadoopFile at PathSplitSource.java:96
18:46:09.299 INFO FileInputFormat - Total input files to process : 1
18:46:09.315 INFO MemoryStore - Block broadcast_36 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
18:46:09.319 INFO MemoryStore - Block broadcast_36_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
18:46:09.319 INFO BlockManagerInfo - Added broadcast_36_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.9 MiB)
18:46:09.320 INFO SparkContext - Created broadcast 36 from broadcast at ReadsSparkSink.java:133
18:46:09.328 INFO MemoryStore - Block broadcast_37 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
18:46:09.333 INFO MemoryStore - Block broadcast_37_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
18:46:09.333 INFO BlockManagerInfo - Added broadcast_37_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.9 MiB)
18:46:09.333 INFO SparkContext - Created broadcast 37 from broadcast at BamSink.java:76
18:46:09.348 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts dst=null perm=null proto=rpc
18:46:09.351 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:09.352 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:09.352 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:09.366 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:09.387 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:09.388 INFO DAGScheduler - Registering RDD 77 (mapToPair at SparkUtils.java:161) as input to shuffle 7
18:46:09.388 INFO DAGScheduler - Got job 20 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:09.388 INFO DAGScheduler - Final stage: ResultStage 30 (runJob at SparkHadoopWriter.scala:83)
18:46:09.388 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 29)
18:46:09.388 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 29)
18:46:09.389 INFO DAGScheduler - Submitting ShuffleMapStage 29 (MapPartitionsRDD[77] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:09.426 INFO MemoryStore - Block broadcast_38 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
18:46:09.428 INFO MemoryStore - Block broadcast_38_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
18:46:09.429 INFO BlockManagerInfo - Added broadcast_38_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.7 MiB)
18:46:09.429 INFO SparkContext - Created broadcast 38 from broadcast at DAGScheduler.scala:1580
18:46:09.430 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 29 (MapPartitionsRDD[77] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:09.430 INFO TaskSchedulerImpl - Adding task set 29.0 with 1 tasks resource profile 0
18:46:09.434 INFO TaskSetManager - Starting task 0.0 in stage 29.0 (TID 67) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:09.434 INFO Executor - Running task 0.0 in stage 29.0 (TID 67)
18:46:09.506 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:09.569 INFO Executor - Finished task 0.0 in stage 29.0 (TID 67). 1148 bytes result sent to driver
18:46:09.570 INFO TaskSetManager - Finished task 0.0 in stage 29.0 (TID 67) in 140 ms on localhost (executor driver) (1/1)
18:46:09.571 INFO TaskSchedulerImpl - Removed TaskSet 29.0, whose tasks have all completed, from pool
18:46:09.571 INFO DAGScheduler - ShuffleMapStage 29 (mapToPair at SparkUtils.java:161) finished in 0.180 s
18:46:09.571 INFO DAGScheduler - looking for newly runnable stages
18:46:09.571 INFO DAGScheduler - running: HashSet()
18:46:09.571 INFO DAGScheduler - waiting: HashSet(ResultStage 30)
18:46:09.571 INFO DAGScheduler - failed: HashSet()
18:46:09.571 INFO DAGScheduler - Submitting ResultStage 30 (MapPartitionsRDD[82] at mapToPair at BamSink.java:91), which has no missing parents
18:46:09.581 INFO MemoryStore - Block broadcast_39 stored as values in memory (estimated size 241.5 KiB, free 1918.1 MiB)
18:46:09.582 INFO MemoryStore - Block broadcast_39_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.0 MiB)
18:46:09.582 INFO BlockManagerInfo - Added broadcast_39_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.7 MiB)
18:46:09.582 INFO SparkContext - Created broadcast 39 from broadcast at DAGScheduler.scala:1580
18:46:09.583 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 30 (MapPartitionsRDD[82] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:09.583 INFO TaskSchedulerImpl - Adding task set 30.0 with 1 tasks resource profile 0
18:46:09.584 INFO TaskSetManager - Starting task 0.0 in stage 30.0 (TID 68) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:09.584 INFO Executor - Running task 0.0 in stage 30.0 (TID 68)
18:46:09.599 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:09.599 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:09.680 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:09.680 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:09.680 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:09.681 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:09.681 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:09.681 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:09.699 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:09.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:09.715 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:09.735 INFO StateChange - BLOCK* allocate blk_1073741825_1001, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/part-r-00000
18:46:09.763 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741825_1001 src: /127.0.0.1:54636 dest: /127.0.0.1:38019
18:46:09.785 INFO clienttrace - src: /127.0.0.1:54636, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741825_1001, duration(ns): 4621136
18:46:09.786 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
18:46:09.790 INFO FSNamesystem - BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/part-r-00000
18:46:10.193 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:10.194 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:10.197 INFO StateChange - BLOCK* allocate blk_1073741826_1002, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/.part-r-00000.sbi
18:46:10.199 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741826_1002 src: /127.0.0.1:54642 dest: /127.0.0.1:38019
18:46:10.200 INFO clienttrace - src: /127.0.0.1:54642, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741826_1002, duration(ns): 667826
18:46:10.201 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
18:46:10.202 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:10.206 INFO StateChange - BLOCK* allocate blk_1073741827_1003, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/.part-r-00000.bai
18:46:10.207 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741827_1003 src: /127.0.0.1:54646 dest: /127.0.0.1:38019
18:46:10.209 INFO clienttrace - src: /127.0.0.1:54646, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741827_1003, duration(ns): 670732
18:46:10.209 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741827_1003, type=LAST_IN_PIPELINE terminating
18:46:10.210 INFO FSNamesystem - BLOCK* blk_1073741827_1003 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/.part-r-00000.bai
18:46:10.611 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:10.614 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0 dst=null perm=null proto=rpc
18:46:10.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0 dst=null perm=null proto=rpc
18:46:10.619 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/task_202505191846093412826888066902118_0082_r_000000 dst=null perm=null proto=rpc
18:46:10.624 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/_temporary/attempt_202505191846093412826888066902118_0082_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/task_202505191846093412826888066902118_0082_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:10.625 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846093412826888066902118_0082_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/task_202505191846093412826888066902118_0082_r_000000
18:46:10.626 INFO SparkHadoopMapRedUtil - attempt_202505191846093412826888066902118_0082_r_000000_0: Committed. Elapsed time: 8 ms.
18:46:10.627 INFO Executor - Finished task 0.0 in stage 30.0 (TID 68). 1858 bytes result sent to driver
18:46:10.628 INFO TaskSetManager - Finished task 0.0 in stage 30.0 (TID 68) in 1045 ms on localhost (executor driver) (1/1)
18:46:10.628 INFO TaskSchedulerImpl - Removed TaskSet 30.0, whose tasks have all completed, from pool
18:46:10.629 INFO DAGScheduler - ResultStage 30 (runJob at SparkHadoopWriter.scala:83) finished in 1.057 s
18:46:10.629 INFO DAGScheduler - Job 20 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:10.629 INFO TaskSchedulerImpl - Killing all running tasks in stage 30: Stage finished
18:46:10.629 INFO DAGScheduler - Job 20 finished: runJob at SparkHadoopWriter.scala:83, took 1.242526 s
18:46:10.631 INFO SparkHadoopWriter - Start to commit write Job job_202505191846093412826888066902118_0082.
18:46:10.634 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:10.636 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts dst=null perm=null proto=rpc
18:46:10.637 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/task_202505191846093412826888066902118_0082_r_000000 dst=null perm=null proto=rpc
18:46:10.638 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:10.639 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/task_202505191846093412826888066902118_0082_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:10.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:10.641 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/task_202505191846093412826888066902118_0082_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:10.642 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:10.643 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary/0/task_202505191846093412826888066902118_0082_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:10.649 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:10.651 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:10.652 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:10.655 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.spark-staging-82 dst=null perm=null proto=rpc
18:46:10.655 INFO SparkHadoopWriter - Write Job job_202505191846093412826888066902118_0082 committed. Elapsed time: 23 ms.
18:46:10.656 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:10.658 INFO StateChange - BLOCK* allocate blk_1073741828_1004, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/header
18:46:10.660 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741828_1004 src: /127.0.0.1:54650 dest: /127.0.0.1:38019
18:46:10.662 INFO clienttrace - src: /127.0.0.1:54650, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741828_1004, duration(ns): 724756
18:46:10.662 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741828_1004, type=LAST_IN_PIPELINE terminating
18:46:10.663 INFO FSNamesystem - BLOCK* blk_1073741828_1004 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/header
18:46:11.064 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:11.066 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:11.068 INFO StateChange - BLOCK* allocate blk_1073741829_1005, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/terminator
18:46:11.069 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741829_1005 src: /127.0.0.1:54656 dest: /127.0.0.1:38019
18:46:11.071 INFO clienttrace - src: /127.0.0.1:54656, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741829_1005, duration(ns): 664240
18:46:11.071 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741829_1005, type=LAST_IN_PIPELINE terminating
18:46:11.072 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:11.073 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts dst=null perm=null proto=rpc
18:46:11.079 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:11.080 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:11.080 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam
18:46:11.083 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:11.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:11.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:11.085 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam done
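The concat recorded just above assembles the final BAM from three pieces (header, part-r-00000, terminator) without copying bytes. A minimal sketch of that pattern, assuming only the stock Hadoop FileSystem API and hypothetical class/method names; the test itself goes through HadoopFileSystemWrapper, so this is illustrative rather than the actual implementation:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ConcatSketch {
    // Assemble <name>.bam from the pieces left in the <name>.bam.parts directory.
    static void assembleBam(Configuration conf, Path partsDir, Path finalBam) throws Exception {
        FileSystem fs = partsDir.getFileSystem(conf);
        Path target = new Path(partsDir, "output");
        fs.create(target).close();                       // empty target file (cmd=create .../output above)
        fs.concat(target, new Path[] {                   // HDFS concat: moves blocks onto the target, no byte copy
                new Path(partsDir, "header"),
                new Path(partsDir, "part-r-00000"),
                new Path(partsDir, "terminator")});
        fs.rename(target, finalBam);                     // cmd=rename .../output -> .../<name>.bam above
    }
}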
18:46:11.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:11.087 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi
18:46:11.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts dst=null perm=null proto=rpc
18:46:11.090 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:11.092 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:11.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:11.129 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:11.131 INFO StateChange - BLOCK* allocate blk_1073741830_1006, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi
18:46:11.132 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741830_1006 src: /127.0.0.1:54676 dest: /127.0.0.1:38019
18:46:11.133 INFO clienttrace - src: /127.0.0.1:54676, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741830_1006, duration(ns): 700487
18:46:11.133 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741830_1006, type=LAST_IN_PIPELINE terminating
18:46:11.134 INFO FSNamesystem - BLOCK* blk_1073741830_1006 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi
18:46:11.536 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:11.536 INFO IndexFileMerger - Done merging .sbi files
18:46:11.538 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai
18:46:11.538 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts dst=null perm=null proto=rpc
18:46:11.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:11.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:11.543 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:11.545 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:11.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:11.555 INFO StateChange - BLOCK* allocate blk_1073741831_1007, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai
18:46:11.556 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741831_1007 src: /127.0.0.1:54690 dest: /127.0.0.1:38019
18:46:11.557 INFO clienttrace - src: /127.0.0.1:54690, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741831_1007, duration(ns): 599834
18:46:11.557 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741831_1007, type=LAST_IN_PIPELINE terminating
18:46:11.558 INFO FSNamesystem - BLOCK* blk_1073741831_1007 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai
18:46:11.959 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:11.960 INFO IndexFileMerger - Done merging .bai files
18:46:11.961 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.parts dst=null perm=null proto=rpc
18:46:11.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:11.978 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi dst=null perm=null proto=rpc
18:46:11.979 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi dst=null perm=null proto=rpc
18:46:11.980 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi dst=null perm=null proto=rpc
18:46:11.983 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:11.983 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:11.984 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:11.984 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:11.985 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:11.987 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:11.987 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:11.988 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:11.990 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:11.995 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:11.996 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi dst=null perm=null proto=rpc
18:46:11.997 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi dst=null perm=null proto=rpc
18:46:11.997 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.sbi dst=null perm=null proto=rpc
18:46:11.999 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:11.999 INFO MemoryStore - Block broadcast_40 stored as values in memory (estimated size 320.0 B, free 1918.0 MiB)
18:46:12.000 INFO MemoryStore - Block broadcast_40_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.0 MiB)
18:46:12.000 INFO BlockManagerInfo - Added broadcast_40_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.7 MiB)
18:46:12.001 INFO SparkContext - Created broadcast 40 from broadcast at BamSource.java:104
18:46:12.004 INFO MemoryStore - Block broadcast_41 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
18:46:12.011 INFO MemoryStore - Block broadcast_41_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
18:46:12.011 INFO BlockManagerInfo - Added broadcast_41_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:12.012 INFO SparkContext - Created broadcast 41 from newAPIHadoopFile at PathSplitSource.java:96
18:46:12.032 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:12.032 INFO FileInputFormat - Total input files to process : 1
18:46:12.033 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:12.051 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741826_1002 replica FinalizedReplica, blk_1073741826_1002, FINALIZED (getNumBytes() = 212, getBytesOnDisk() = 212, getVisibleLength() = 212, getVolume() = /tmp/minicluster_storage13238592372457082651/data/data2, getBlockURI() = file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741826) for deletion
18:46:12.052 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741826_1002 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741826
18:46:12.057 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:12.058 INFO DAGScheduler - Got job 21 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:12.058 INFO DAGScheduler - Final stage: ResultStage 31 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:12.058 INFO DAGScheduler - Parents of final stage: List()
18:46:12.058 INFO DAGScheduler - Missing parents: List()
18:46:12.058 INFO DAGScheduler - Submitting ResultStage 31 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:12.067 INFO MemoryStore - Block broadcast_42 stored as values in memory (estimated size 148.2 KiB, free 1917.5 MiB)
18:46:12.068 INFO MemoryStore - Block broadcast_42_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
18:46:12.068 INFO BlockManagerInfo - Added broadcast_42_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:12.068 INFO SparkContext - Created broadcast 42 from broadcast at DAGScheduler.scala:1580
18:46:12.069 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 31 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:12.069 INFO TaskSchedulerImpl - Adding task set 31.0 with 1 tasks resource profile 0
18:46:12.070 INFO TaskSetManager - Starting task 0.0 in stage 31.0 (TID 69) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:12.070 INFO Executor - Running task 0.0 in stage 31.0 (TID 69)
18:46:12.085 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam:0+237038
18:46:12.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:12.087 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:12.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:12.089 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:12.090 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:12.092 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:12.095 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:12.096 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:12.099 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:12.100 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:12.106 INFO Executor - Finished task 0.0 in stage 31.0 (TID 69). 651526 bytes result sent to driver
18:46:12.111 INFO TaskSetManager - Finished task 0.0 in stage 31.0 (TID 69) in 42 ms on localhost (executor driver) (1/1)
18:46:12.111 INFO TaskSchedulerImpl - Removed TaskSet 31.0, whose tasks have all completed, from pool
18:46:12.111 INFO DAGScheduler - ResultStage 31 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.052 s
18:46:12.112 INFO DAGScheduler - Job 21 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:12.112 INFO TaskSchedulerImpl - Killing all running tasks in stage 31: Stage finished
18:46:12.112 INFO DAGScheduler - Job 21 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.054512 s
18:46:12.142 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:12.143 INFO DAGScheduler - Got job 22 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:12.143 INFO DAGScheduler - Final stage: ResultStage 32 (count at ReadsSparkSinkUnitTest.java:185)
18:46:12.143 INFO DAGScheduler - Parents of final stage: List()
18:46:12.143 INFO DAGScheduler - Missing parents: List()
18:46:12.143 INFO DAGScheduler - Submitting ResultStage 32 (MapPartitionsRDD[70] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:12.164 INFO MemoryStore - Block broadcast_43 stored as values in memory (estimated size 426.1 KiB, free 1917.1 MiB)
18:46:12.166 INFO MemoryStore - Block broadcast_43_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
18:46:12.166 INFO BlockManagerInfo - Added broadcast_43_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.4 MiB)
18:46:12.166 INFO SparkContext - Created broadcast 43 from broadcast at DAGScheduler.scala:1580
18:46:12.167 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 32 (MapPartitionsRDD[70] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:12.167 INFO TaskSchedulerImpl - Adding task set 32.0 with 1 tasks resource profile 0
18:46:12.168 INFO TaskSetManager - Starting task 0.0 in stage 32.0 (TID 70) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:12.168 INFO Executor - Running task 0.0 in stage 32.0 (TID 70)
18:46:12.205 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:12.254 INFO Executor - Finished task 0.0 in stage 32.0 (TID 70). 989 bytes result sent to driver
18:46:12.255 INFO TaskSetManager - Finished task 0.0 in stage 32.0 (TID 70) in 88 ms on localhost (executor driver) (1/1)
18:46:12.255 INFO TaskSchedulerImpl - Removed TaskSet 32.0, whose tasks have all completed, from pool
18:46:12.255 INFO DAGScheduler - ResultStage 32 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.111 s
18:46:12.255 INFO DAGScheduler - Job 22 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:12.255 INFO TaskSchedulerImpl - Killing all running tasks in stage 32: Stage finished
18:46:12.255 INFO DAGScheduler - Job 22 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.112740 s
18:46:12.262 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:12.262 INFO DAGScheduler - Got job 23 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:12.262 INFO DAGScheduler - Final stage: ResultStage 33 (count at ReadsSparkSinkUnitTest.java:185)
18:46:12.262 INFO DAGScheduler - Parents of final stage: List()
18:46:12.262 INFO DAGScheduler - Missing parents: List()
18:46:12.262 INFO DAGScheduler - Submitting ResultStage 33 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:12.270 INFO MemoryStore - Block broadcast_44 stored as values in memory (estimated size 148.1 KiB, free 1916.8 MiB)
18:46:12.271 INFO MemoryStore - Block broadcast_44_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.7 MiB)
18:46:12.272 INFO BlockManagerInfo - Added broadcast_44_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.3 MiB)
18:46:12.272 INFO SparkContext - Created broadcast 44 from broadcast at DAGScheduler.scala:1580
18:46:12.272 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 33 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:12.272 INFO TaskSchedulerImpl - Adding task set 33.0 with 1 tasks resource profile 0
18:46:12.273 INFO TaskSetManager - Starting task 0.0 in stage 33.0 (TID 71) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:12.273 INFO Executor - Running task 0.0 in stage 33.0 (TID 71)
18:46:12.288 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam:0+237038
18:46:12.289 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:12.290 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam dst=null perm=null proto=rpc
18:46:12.291 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:12.292 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:12.293 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_3c073da5-df58-4794-897b-043de0898431.bam.bai dst=null perm=null proto=rpc
18:46:12.295 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:12.298 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:12.299 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:12.302 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:12.305 INFO Executor - Finished task 0.0 in stage 33.0 (TID 71). 989 bytes result sent to driver
18:46:12.306 INFO TaskSetManager - Finished task 0.0 in stage 33.0 (TID 71) in 33 ms on localhost (executor driver) (1/1)
18:46:12.306 INFO TaskSchedulerImpl - Removed TaskSet 33.0, whose tasks have all completed, from pool
18:46:12.306 INFO DAGScheduler - ResultStage 33 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.043 s
18:46:12.307 INFO DAGScheduler - Job 23 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:12.307 INFO TaskSchedulerImpl - Killing all running tasks in stage 33: Stage finished
18:46:12.307 INFO DAGScheduler - Job 23 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.044844 s
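Jobs 21-23 above are the read-back verification: the merged BAM on the mini-cluster is collected and counted, then compared against a count over the original local input. A hedged sketch of that comparison, assuming the two JavaRDDs have already been produced by whatever reads source the test uses; the class and method names here are hypothetical:

import java.util.List;
import org.apache.spark.api.java.JavaRDD;

public class RoundTripCheckSketch {
    // writtenBack: reads re-read from the merged BAM on HDFS; original: reads from the local input BAM.
    static <T> void check(JavaRDD<T> writtenBack, JavaRDD<T> original) {
        List<T> collected = writtenBack.collect();   // job 21: collect at ReadsSparkSinkUnitTest.java:182
        long expected = original.count();            // job 22: count over the original input
        long actual = writtenBack.count();           // job 23: count over the written-back data
        if (expected != actual || collected.size() != actual) {
            throw new AssertionError("round trip mismatch: expected=" + expected + " actual=" + actual);
        }
    }
}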
18:46:12.312 INFO MemoryStore - Block broadcast_45 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
18:46:12.320 INFO MemoryStore - Block broadcast_45_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:12.320 INFO BlockManagerInfo - Added broadcast_45_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:12.320 INFO SparkContext - Created broadcast 45 from newAPIHadoopFile at PathSplitSource.java:96
18:46:12.359 INFO MemoryStore - Block broadcast_46 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
18:46:12.366 INFO MemoryStore - Block broadcast_46_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
18:46:12.366 INFO BlockManagerInfo - Added broadcast_46_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:12.367 INFO SparkContext - Created broadcast 46 from newAPIHadoopFile at PathSplitSource.java:96
18:46:12.393 INFO FileInputFormat - Total input files to process : 1
18:46:12.397 INFO MemoryStore - Block broadcast_47 stored as values in memory (estimated size 160.7 KiB, free 1915.9 MiB)
18:46:12.414 INFO MemoryStore - Block broadcast_47_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.9 MiB)
18:46:12.414 INFO BlockManagerInfo - Added broadcast_47_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.2 MiB)
18:46:12.415 INFO SparkContext - Created broadcast 47 from broadcast at ReadsSparkSink.java:133
18:46:12.415 INFO BlockManagerInfo - Removed broadcast_44_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.3 MiB)
18:46:12.416 INFO BlockManagerInfo - Removed broadcast_39_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.4 MiB)
18:46:12.420 INFO BlockManagerInfo - Removed broadcast_40_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.4 MiB)
18:46:12.421 INFO MemoryStore - Block broadcast_48 stored as values in memory (estimated size 163.2 KiB, free 1916.2 MiB)
18:46:12.422 INFO BlockManagerInfo - Removed broadcast_43_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.5 MiB)
18:46:12.424 INFO BlockManagerInfo - Removed broadcast_37_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:12.424 INFO MemoryStore - Block broadcast_48_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.9 MiB)
18:46:12.424 INFO BlockManagerInfo - Added broadcast_48_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:12.425 INFO SparkContext - Created broadcast 48 from broadcast at BamSink.java:76
18:46:12.428 INFO BlockManagerInfo - Removed broadcast_46_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:12.430 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts dst=null perm=null proto=rpc
18:46:12.431 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:12.431 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:12.431 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:12.431 INFO BlockManagerInfo - Removed broadcast_38_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.7 MiB)
18:46:12.433 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:12.435 INFO BlockManagerInfo - Removed broadcast_35_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:12.436 INFO BlockManagerInfo - Removed broadcast_41_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:12.437 INFO BlockManagerInfo - Removed broadcast_36_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:12.439 INFO BlockManagerInfo - Removed broadcast_34_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.9 MiB)
18:46:12.441 INFO BlockManagerInfo - Removed broadcast_42_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.9 MiB)
18:46:12.447 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:12.447 INFO DAGScheduler - Registering RDD 102 (mapToPair at SparkUtils.java:161) as input to shuffle 8
18:46:12.448 INFO DAGScheduler - Got job 24 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:12.448 INFO DAGScheduler - Final stage: ResultStage 35 (runJob at SparkHadoopWriter.scala:83)
18:46:12.448 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 34)
18:46:12.448 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 34)
18:46:12.448 INFO DAGScheduler - Submitting ShuffleMapStage 34 (MapPartitionsRDD[102] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:12.493 INFO MemoryStore - Block broadcast_49 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
18:46:12.494 INFO MemoryStore - Block broadcast_49_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
18:46:12.495 INFO BlockManagerInfo - Added broadcast_49_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.8 MiB)
18:46:12.496 INFO SparkContext - Created broadcast 49 from broadcast at DAGScheduler.scala:1580
18:46:12.496 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 34 (MapPartitionsRDD[102] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:12.496 INFO TaskSchedulerImpl - Adding task set 34.0 with 1 tasks resource profile 0
18:46:12.497 INFO TaskSetManager - Starting task 0.0 in stage 34.0 (TID 72) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:12.498 INFO Executor - Running task 0.0 in stage 34.0 (TID 72)
18:46:12.554 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:12.580 INFO Executor - Finished task 0.0 in stage 34.0 (TID 72). 1148 bytes result sent to driver
18:46:12.580 INFO TaskSetManager - Finished task 0.0 in stage 34.0 (TID 72) in 83 ms on localhost (executor driver) (1/1)
18:46:12.580 INFO TaskSchedulerImpl - Removed TaskSet 34.0, whose tasks have all completed, from pool
18:46:12.581 INFO DAGScheduler - ShuffleMapStage 34 (mapToPair at SparkUtils.java:161) finished in 0.132 s
18:46:12.581 INFO DAGScheduler - looking for newly runnable stages
18:46:12.581 INFO DAGScheduler - running: HashSet()
18:46:12.581 INFO DAGScheduler - waiting: HashSet(ResultStage 35)
18:46:12.581 INFO DAGScheduler - failed: HashSet()
18:46:12.581 INFO DAGScheduler - Submitting ResultStage 35 (MapPartitionsRDD[107] at mapToPair at BamSink.java:91), which has no missing parents
18:46:12.590 INFO MemoryStore - Block broadcast_50 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
18:46:12.591 INFO MemoryStore - Block broadcast_50_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
18:46:12.591 INFO BlockManagerInfo - Added broadcast_50_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.7 MiB)
18:46:12.591 INFO SparkContext - Created broadcast 50 from broadcast at DAGScheduler.scala:1580
18:46:12.592 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 35 (MapPartitionsRDD[107] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:12.592 INFO TaskSchedulerImpl - Adding task set 35.0 with 1 tasks resource profile 0
18:46:12.593 INFO TaskSetManager - Starting task 0.0 in stage 35.0 (TID 73) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:12.593 INFO Executor - Running task 0.0 in stage 35.0 (TID 73)
18:46:12.605 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:12.605 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:12.664 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:12.664 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:12.664 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:12.664 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:12.664 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:12.664 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:12.666 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:12.667 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:12.669 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:12.672 INFO StateChange - BLOCK* allocate blk_1073741832_1008, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/part-r-00000
18:46:12.674 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741832_1008 src: /127.0.0.1:54724 dest: /127.0.0.1:38019
18:46:12.679 INFO clienttrace - src: /127.0.0.1:54724, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741832_1008, duration(ns): 3640139
18:46:12.679 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741832_1008, type=LAST_IN_PIPELINE terminating
18:46:12.680 INFO FSNamesystem - BLOCK* blk_1073741832_1008 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/part-r-00000
18:46:13.081 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.082 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:13.084 INFO StateChange - BLOCK* allocate blk_1073741833_1009, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/.part-r-00000.sbi
18:46:13.085 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741833_1009 src: /127.0.0.1:54740 dest: /127.0.0.1:38019
18:46:13.087 INFO clienttrace - src: /127.0.0.1:54740, dest: /127.0.0.1:38019, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741833_1009, duration(ns): 684394
18:46:13.087 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741833_1009, type=LAST_IN_PIPELINE terminating
18:46:13.088 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.090 INFO StateChange - BLOCK* allocate blk_1073741834_1010, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/.part-r-00000.bai
18:46:13.091 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741834_1010 src: /127.0.0.1:54754 dest: /127.0.0.1:38019
18:46:13.093 INFO clienttrace - src: /127.0.0.1:54754, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741834_1010, duration(ns): 484322
18:46:13.093 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741834_1010, type=LAST_IN_PIPELINE terminating
18:46:13.094 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.095 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0 dst=null perm=null proto=rpc
18:46:13.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0 dst=null perm=null proto=rpc
18:46:13.097 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/task_202505191846121096793037164092214_0107_r_000000 dst=null perm=null proto=rpc
18:46:13.098 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/_temporary/attempt_202505191846121096793037164092214_0107_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/task_202505191846121096793037164092214_0107_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:13.098 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846121096793037164092214_0107_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/task_202505191846121096793037164092214_0107_r_000000
18:46:13.098 INFO SparkHadoopMapRedUtil - attempt_202505191846121096793037164092214_0107_r_000000_0: Committed. Elapsed time: 2 ms.
18:46:13.099 INFO Executor - Finished task 0.0 in stage 35.0 (TID 73). 1858 bytes result sent to driver
18:46:13.101 INFO TaskSetManager - Finished task 0.0 in stage 35.0 (TID 73) in 509 ms on localhost (executor driver) (1/1)
18:46:13.101 INFO TaskSchedulerImpl - Removed TaskSet 35.0, whose tasks have all completed, from pool
18:46:13.101 INFO DAGScheduler - ResultStage 35 (runJob at SparkHadoopWriter.scala:83) finished in 0.519 s
18:46:13.101 INFO DAGScheduler - Job 24 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:13.102 INFO TaskSchedulerImpl - Killing all running tasks in stage 35: Stage finished
18:46:13.102 INFO DAGScheduler - Job 24 finished: runJob at SparkHadoopWriter.scala:83, took 0.654878 s
18:46:13.103 INFO SparkHadoopWriter - Start to commit write Job job_202505191846121096793037164092214_0107.
18:46:13.103 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:13.104 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts dst=null perm=null proto=rpc
18:46:13.105 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/task_202505191846121096793037164092214_0107_r_000000 dst=null perm=null proto=rpc
18:46:13.106 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:13.107 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/task_202505191846121096793037164092214_0107_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.107 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:13.108 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/task_202505191846121096793037164092214_0107_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.109 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:13.110 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary/0/task_202505191846121096793037164092214_0107_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.110 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:13.111 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.112 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.113 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.spark-staging-107 dst=null perm=null proto=rpc
18:46:13.113 INFO SparkHadoopWriter - Write Job job_202505191846121096793037164092214_0107 committed. Elapsed time: 10 ms.
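The commit sequence above follows FileOutputCommitter algorithm version 1: the task attempt directory is renamed into a task directory, then the job commit promotes each task file into the output directory, deletes _temporary, and writes an empty _SUCCESS marker. A minimal sketch of the job-commit half, assuming the standard Hadoop FileSystem API; this illustrates the renames observed in the audit entries, not the Hadoop implementation itself:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CommitV1Sketch {
    static void commitJob(Configuration conf, Path outputDir) throws Exception {
        FileSystem fs = outputDir.getFileSystem(conf);
        Path jobTmp = new Path(outputDir, "_temporary/0");
        for (FileStatus task : fs.listStatus(jobTmp)) {            // task_..._r_000000 directories
            if (!task.isDirectory() || task.getPath().getName().startsWith("_")) {
                continue;                                          // skip any leftover attempt dirs
            }
            for (FileStatus f : fs.listStatus(task.getPath())) {   // part-r-00000, .part-r-00000.bai, ...
                fs.rename(f.getPath(), new Path(outputDir, f.getPath().getName()));
            }
        }
        fs.delete(new Path(outputDir, "_temporary"), true);        // cmd=delete .../_temporary
        fs.create(new Path(outputDir, "_SUCCESS")).close();        // cmd=create .../_SUCCESS
    }
}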
18:46:13.114 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.116 INFO StateChange - BLOCK* allocate blk_1073741835_1011, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/header
18:46:13.118 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741835_1011 src: /127.0.0.1:54756 dest: /127.0.0.1:38019
18:46:13.119 INFO clienttrace - src: /127.0.0.1:54756, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741835_1011, duration(ns): 615484
18:46:13.119 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741835_1011, type=LAST_IN_PIPELINE terminating
18:46:13.120 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.121 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.123 INFO StateChange - BLOCK* allocate blk_1073741836_1012, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/terminator
18:46:13.124 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741836_1012 src: /127.0.0.1:54760 dest: /127.0.0.1:38019
18:46:13.125 INFO clienttrace - src: /127.0.0.1:54760, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741836_1012, duration(ns): 486180
18:46:13.125 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741836_1012, type=LAST_IN_PIPELINE terminating
18:46:13.126 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.127 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts dst=null perm=null proto=rpc
18:46:13.129 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.130 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.130 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam
18:46:13.131 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.131 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.132 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.132 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam done
18:46:13.133 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.133 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi
18:46:13.133 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts dst=null perm=null proto=rpc
18:46:13.134 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.135 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:13.136 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:13.137 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
18:46:13.138 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:13.140 INFO StateChange - BLOCK* allocate blk_1073741837_1013, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi
18:46:13.141 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741837_1013 src: /127.0.0.1:54776 dest: /127.0.0.1:38019
18:46:13.142 INFO clienttrace - src: /127.0.0.1:54776, dest: /127.0.0.1:38019, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741837_1013, duration(ns): 1049930
18:46:13.142 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741837_1013, type=LAST_IN_PIPELINE terminating
18:46:13.143 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.144 INFO IndexFileMerger - Done merging .sbi files
18:46:13.144 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai
18:46:13.144 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts dst=null perm=null proto=rpc
18:46:13.145 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:13.146 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:13.147 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:13.149 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:13.149 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:13.156 INFO StateChange - BLOCK* allocate blk_1073741838_1014, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai
18:46:13.157 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741838_1014 src: /127.0.0.1:54786 dest: /127.0.0.1:38019
18:46:13.158 INFO clienttrace - src: /127.0.0.1:54786, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741838_1014, duration(ns): 519938
18:46:13.158 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741838_1014, type=LAST_IN_PIPELINE terminating
18:46:13.160 INFO FSNamesystem - BLOCK* blk_1073741838_1014 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai
18:46:13.561 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:13.561 INFO IndexFileMerger - Done merging .bai files
18:46:13.562 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.parts dst=null perm=null proto=rpc
18:46:13.574 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.584 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi dst=null perm=null proto=rpc
18:46:13.585 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi dst=null perm=null proto=rpc
18:46:13.586 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi dst=null perm=null proto=rpc
18:46:13.587 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
18:46:13.588 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.591 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.592 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.593 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.594 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:13.598 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:13.598 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi dst=null perm=null proto=rpc
18:46:13.599 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi dst=null perm=null proto=rpc
18:46:13.600 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.sbi dst=null perm=null proto=rpc
18:46:13.601 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
18:46:13.602 INFO MemoryStore - Block broadcast_51 stored as values in memory (estimated size 13.3 KiB, free 1918.3 MiB)
18:46:13.602 INFO MemoryStore - Block broadcast_51_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.3 MiB)
18:46:13.603 INFO BlockManagerInfo - Added broadcast_51_piece0 in memory on localhost:45727 (size: 8.3 KiB, free: 1919.7 MiB)
18:46:13.603 INFO SparkContext - Created broadcast 51 from broadcast at BamSource.java:104
18:46:13.605 INFO MemoryStore - Block broadcast_52 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
18:46:13.612 INFO MemoryStore - Block broadcast_52_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:13.613 INFO BlockManagerInfo - Added broadcast_52_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:13.613 INFO SparkContext - Created broadcast 52 from newAPIHadoopFile at PathSplitSource.java:96
18:46:13.625 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.625 INFO FileInputFormat - Total input files to process : 1
18:46:13.626 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.648 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:13.648 INFO DAGScheduler - Got job 25 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:13.648 INFO DAGScheduler - Final stage: ResultStage 36 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:13.648 INFO DAGScheduler - Parents of final stage: List()
18:46:13.648 INFO DAGScheduler - Missing parents: List()
18:46:13.648 INFO DAGScheduler - Submitting ResultStage 36 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:13.660 INFO MemoryStore - Block broadcast_53 stored as values in memory (estimated size 148.2 KiB, free 1917.8 MiB)
18:46:13.661 INFO MemoryStore - Block broadcast_53_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
18:46:13.661 INFO BlockManagerInfo - Added broadcast_53_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:13.661 INFO SparkContext - Created broadcast 53 from broadcast at DAGScheduler.scala:1580
18:46:13.662 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 36 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:13.662 INFO TaskSchedulerImpl - Adding task set 36.0 with 1 tasks resource profile 0
18:46:13.663 INFO TaskSetManager - Starting task 0.0 in stage 36.0 (TID 74) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:13.663 INFO Executor - Running task 0.0 in stage 36.0 (TID 74)
18:46:13.677 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam:0+237038
18:46:13.679 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.680 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.681 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.681 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.682 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.687 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:13.687 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:13.689 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:13.690 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:13.694 INFO Executor - Finished task 0.0 in stage 36.0 (TID 74). 651526 bytes result sent to driver
18:46:13.697 INFO TaskSetManager - Finished task 0.0 in stage 36.0 (TID 74) in 35 ms on localhost (executor driver) (1/1)
18:46:13.697 INFO TaskSchedulerImpl - Removed TaskSet 36.0, whose tasks have all completed, from pool
18:46:13.697 INFO DAGScheduler - ResultStage 36 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.048 s
18:46:13.697 INFO DAGScheduler - Job 25 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:13.697 INFO TaskSchedulerImpl - Killing all running tasks in stage 36: Stage finished
18:46:13.697 INFO DAGScheduler - Job 25 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.049496 s
18:46:13.713 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:13.713 INFO DAGScheduler - Got job 26 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:13.713 INFO DAGScheduler - Final stage: ResultStage 37 (count at ReadsSparkSinkUnitTest.java:185)
18:46:13.713 INFO DAGScheduler - Parents of final stage: List()
18:46:13.714 INFO DAGScheduler - Missing parents: List()
18:46:13.714 INFO DAGScheduler - Submitting ResultStage 37 (MapPartitionsRDD[95] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:13.739 INFO MemoryStore - Block broadcast_54 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
18:46:13.741 INFO MemoryStore - Block broadcast_54_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
18:46:13.741 INFO BlockManagerInfo - Added broadcast_54_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.4 MiB)
18:46:13.741 INFO SparkContext - Created broadcast 54 from broadcast at DAGScheduler.scala:1580
18:46:13.742 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 37 (MapPartitionsRDD[95] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:13.742 INFO TaskSchedulerImpl - Adding task set 37.0 with 1 tasks resource profile 0
18:46:13.742 INFO TaskSetManager - Starting task 0.0 in stage 37.0 (TID 75) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:13.743 INFO Executor - Running task 0.0 in stage 37.0 (TID 75)
18:46:13.797 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:13.813 INFO Executor - Finished task 0.0 in stage 37.0 (TID 75). 989 bytes result sent to driver
18:46:13.814 INFO TaskSetManager - Finished task 0.0 in stage 37.0 (TID 75) in 72 ms on localhost (executor driver) (1/1)
18:46:13.814 INFO TaskSchedulerImpl - Removed TaskSet 37.0, whose tasks have all completed, from pool
18:46:13.814 INFO DAGScheduler - ResultStage 37 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.100 s
18:46:13.814 INFO DAGScheduler - Job 26 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:13.814 INFO TaskSchedulerImpl - Killing all running tasks in stage 37: Stage finished
18:46:13.814 INFO DAGScheduler - Job 26 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.101299 s
18:46:13.818 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:13.818 INFO DAGScheduler - Got job 27 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:13.819 INFO DAGScheduler - Final stage: ResultStage 38 (count at ReadsSparkSinkUnitTest.java:185)
18:46:13.819 INFO DAGScheduler - Parents of final stage: List()
18:46:13.819 INFO DAGScheduler - Missing parents: List()
18:46:13.819 INFO DAGScheduler - Submitting ResultStage 38 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:13.826 INFO MemoryStore - Block broadcast_55 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
18:46:13.827 INFO MemoryStore - Block broadcast_55_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.0 MiB)
18:46:13.827 INFO BlockManagerInfo - Added broadcast_55_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.4 MiB)
18:46:13.828 INFO SparkContext - Created broadcast 55 from broadcast at DAGScheduler.scala:1580
18:46:13.828 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 38 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:13.828 INFO TaskSchedulerImpl - Adding task set 38.0 with 1 tasks resource profile 0
18:46:13.829 INFO TaskSetManager - Starting task 0.0 in stage 38.0 (TID 76) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:13.829 INFO Executor - Running task 0.0 in stage 38.0 (TID 76)
18:46:13.843 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam:0+237038
18:46:13.844 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.845 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam dst=null perm=null proto=rpc
18:46:13.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.847 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8fda55c8-cc94-4f0e-be0e-794d90520d68.bam.bai dst=null perm=null proto=rpc
18:46:13.849 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:13.852 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:13.853 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:13.856 INFO Executor - Finished task 0.0 in stage 38.0 (TID 76). 989 bytes result sent to driver
18:46:13.856 INFO TaskSetManager - Finished task 0.0 in stage 38.0 (TID 76) in 27 ms on localhost (executor driver) (1/1)
18:46:13.857 INFO TaskSchedulerImpl - Removed TaskSet 38.0, whose tasks have all completed, from pool
18:46:13.857 INFO DAGScheduler - ResultStage 38 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.038 s
18:46:13.857 INFO DAGScheduler - Job 27 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:13.857 INFO TaskSchedulerImpl - Killing all running tasks in stage 38: Stage finished
18:46:13.857 INFO DAGScheduler - Job 27 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.038929 s
18:46:13.861 INFO MemoryStore - Block broadcast_56 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
18:46:13.868 INFO MemoryStore - Block broadcast_56_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
18:46:13.869 INFO BlockManagerInfo - Added broadcast_56_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:13.869 INFO SparkContext - Created broadcast 56 from newAPIHadoopFile at PathSplitSource.java:96
18:46:13.899 INFO MemoryStore - Block broadcast_57 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
18:46:13.906 INFO MemoryStore - Block broadcast_57_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:13.906 INFO BlockManagerInfo - Added broadcast_57_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:13.907 INFO SparkContext - Created broadcast 57 from newAPIHadoopFile at PathSplitSource.java:96
18:46:13.929 INFO FileInputFormat - Total input files to process : 1
18:46:13.932 INFO MemoryStore - Block broadcast_58 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
18:46:13.933 INFO MemoryStore - Block broadcast_58_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
18:46:13.933 INFO BlockManagerInfo - Added broadcast_58_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:13.934 INFO SparkContext - Created broadcast 58 from broadcast at ReadsSparkSink.java:133
18:46:13.934 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:13.936 INFO MemoryStore - Block broadcast_59 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
18:46:13.938 INFO MemoryStore - Block broadcast_59_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
18:46:13.938 INFO BlockManagerInfo - Added broadcast_59_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:13.938 INFO SparkContext - Created broadcast 59 from broadcast at BamSink.java:76
18:46:13.941 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts dst=null perm=null proto=rpc
18:46:13.942 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:13.942 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:13.942 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:13.943 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:13.949 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:13.950 INFO DAGScheduler - Registering RDD 127 (mapToPair at SparkUtils.java:161) as input to shuffle 9
18:46:13.950 INFO DAGScheduler - Got job 28 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:13.950 INFO DAGScheduler - Final stage: ResultStage 40 (runJob at SparkHadoopWriter.scala:83)
18:46:13.950 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 39)
18:46:13.950 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 39)
18:46:13.950 INFO DAGScheduler - Submitting ShuffleMapStage 39 (MapPartitionsRDD[127] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:13.970 INFO MemoryStore - Block broadcast_60 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
18:46:13.971 INFO MemoryStore - Block broadcast_60_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.3 MiB)
18:46:13.972 INFO BlockManagerInfo - Added broadcast_60_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.1 MiB)
18:46:13.972 INFO SparkContext - Created broadcast 60 from broadcast at DAGScheduler.scala:1580
18:46:13.972 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 39 (MapPartitionsRDD[127] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:13.972 INFO TaskSchedulerImpl - Adding task set 39.0 with 1 tasks resource profile 0
18:46:13.973 INFO TaskSetManager - Starting task 0.0 in stage 39.0 (TID 77) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:13.974 INFO Executor - Running task 0.0 in stage 39.0 (TID 77)
18:46:14.030 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:14.062 INFO Executor - Finished task 0.0 in stage 39.0 (TID 77). 1148 bytes result sent to driver
18:46:14.062 INFO TaskSetManager - Finished task 0.0 in stage 39.0 (TID 77) in 89 ms on localhost (executor driver) (1/1)
18:46:14.062 INFO TaskSchedulerImpl - Removed TaskSet 39.0, whose tasks have all completed, from pool
18:46:14.063 INFO DAGScheduler - ShuffleMapStage 39 (mapToPair at SparkUtils.java:161) finished in 0.112 s
18:46:14.063 INFO DAGScheduler - looking for newly runnable stages
18:46:14.063 INFO DAGScheduler - running: HashSet()
18:46:14.063 INFO DAGScheduler - waiting: HashSet(ResultStage 40)
18:46:14.063 INFO DAGScheduler - failed: HashSet()
18:46:14.063 INFO DAGScheduler - Submitting ResultStage 40 (MapPartitionsRDD[132] at mapToPair at BamSink.java:91), which has no missing parents
18:46:14.072 INFO MemoryStore - Block broadcast_61 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
18:46:14.073 INFO MemoryStore - Block broadcast_61_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.0 MiB)
18:46:14.073 INFO BlockManagerInfo - Added broadcast_61_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.0 MiB)
18:46:14.074 INFO SparkContext - Created broadcast 61 from broadcast at DAGScheduler.scala:1580
18:46:14.074 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 40 (MapPartitionsRDD[132] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:14.074 INFO TaskSchedulerImpl - Adding task set 40.0 with 1 tasks resource profile 0
18:46:14.075 INFO TaskSetManager - Starting task 0.0 in stage 40.0 (TID 78) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:14.075 INFO Executor - Running task 0.0 in stage 40.0 (TID 78)
18:46:14.081 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:14.081 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:14.103 INFO BlockManagerInfo - Removed broadcast_48_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.1 MiB)
18:46:14.105 INFO BlockManagerInfo - Removed broadcast_53_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.1 MiB)
18:46:14.105 INFO BlockManagerInfo - Removed broadcast_52_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:14.106 INFO BlockManagerInfo - Removed broadcast_51_piece0 on localhost:45727 in memory (size: 8.3 KiB, free: 1919.2 MiB)
18:46:14.108 INFO BlockManagerInfo - Removed broadcast_47_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.2 MiB)
18:46:14.109 INFO BlockManagerInfo - Removed broadcast_54_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:14.110 INFO BlockManagerInfo - Removed broadcast_50_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.4 MiB)
18:46:14.110 INFO BlockManagerInfo - Removed broadcast_49_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.6 MiB)
18:46:14.111 INFO BlockManagerInfo - Removed broadcast_45_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:14.112 INFO BlockManagerInfo - Removed broadcast_60_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.8 MiB)
18:46:14.112 INFO BlockManagerInfo - Removed broadcast_55_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.8 MiB)
18:46:14.113 INFO BlockManagerInfo - Removed broadcast_57_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.9 MiB)
18:46:14.117 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:14.117 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:14.117 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:14.117 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:14.117 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:14.117 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:14.119 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:14.120 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:14.124 INFO StateChange - BLOCK* allocate blk_1073741839_1015, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/part-r-00000
18:46:14.125 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741839_1015 src: /127.0.0.1:54804 dest: /127.0.0.1:38019
18:46:14.130 INFO clienttrace - src: /127.0.0.1:54804, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741839_1015, duration(ns): 3868942
18:46:14.130 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741839_1015, type=LAST_IN_PIPELINE terminating
18:46:14.131 INFO FSNamesystem - BLOCK* blk_1073741839_1015 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/part-r-00000
18:46:14.532 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:14.533 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:14.536 INFO StateChange - BLOCK* allocate blk_1073741840_1016, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/.part-r-00000.bai
18:46:14.537 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741840_1016 src: /127.0.0.1:54812 dest: /127.0.0.1:38019
18:46:14.538 INFO clienttrace - src: /127.0.0.1:54812, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741840_1016, duration(ns): 564621
18:46:14.538 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741840_1016, type=LAST_IN_PIPELINE terminating
18:46:14.539 INFO FSNamesystem - BLOCK* blk_1073741840_1016 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/.part-r-00000.bai
18:46:14.940 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:14.942 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0 dst=null perm=null proto=rpc
18:46:14.942 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0 dst=null perm=null proto=rpc
18:46:14.943 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/task_202505191846133874433772720720390_0132_r_000000 dst=null perm=null proto=rpc
18:46:14.944 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/_temporary/attempt_202505191846133874433772720720390_0132_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/task_202505191846133874433772720720390_0132_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:14.944 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846133874433772720720390_0132_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/task_202505191846133874433772720720390_0132_r_000000
18:46:14.944 INFO SparkHadoopMapRedUtil - attempt_202505191846133874433772720720390_0132_r_000000_0: Committed. Elapsed time: 2 ms.
18:46:14.945 INFO Executor - Finished task 0.0 in stage 40.0 (TID 78). 1901 bytes result sent to driver
18:46:14.947 INFO TaskSetManager - Finished task 0.0 in stage 40.0 (TID 78) in 872 ms on localhost (executor driver) (1/1)
18:46:14.947 INFO TaskSchedulerImpl - Removed TaskSet 40.0, whose tasks have all completed, from pool
18:46:14.948 INFO DAGScheduler - ResultStage 40 (runJob at SparkHadoopWriter.scala:83) finished in 0.884 s
18:46:14.948 INFO DAGScheduler - Job 28 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:14.948 INFO TaskSchedulerImpl - Killing all running tasks in stage 40: Stage finished
18:46:14.948 INFO DAGScheduler - Job 28 finished: runJob at SparkHadoopWriter.scala:83, took 0.998957 s
18:46:14.949 INFO SparkHadoopWriter - Start to commit write Job job_202505191846133874433772720720390_0132.
18:46:14.950 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:14.950 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts dst=null perm=null proto=rpc
18:46:14.951 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/task_202505191846133874433772720720390_0132_r_000000 dst=null perm=null proto=rpc
18:46:14.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:14.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/task_202505191846133874433772720720390_0132_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:14.953 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:14.954 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary/0/task_202505191846133874433772720720390_0132_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:14.954 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:14.955 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:14.956 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:14.957 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/.spark-staging-132 dst=null perm=null proto=rpc
18:46:14.957 INFO SparkHadoopWriter - Write Job job_202505191846133874433772720720390_0132 committed. Elapsed time: 8 ms.
18:46:14.958 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:14.960 INFO StateChange - BLOCK* allocate blk_1073741841_1017, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/header
18:46:14.961 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741841_1017 src: /127.0.0.1:54822 dest: /127.0.0.1:38019
18:46:14.963 INFO clienttrace - src: /127.0.0.1:54822, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741841_1017, duration(ns): 621145
18:46:14.963 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741841_1017, type=LAST_IN_PIPELINE terminating
18:46:14.964 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:14.965 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:14.966 INFO StateChange - BLOCK* allocate blk_1073741842_1018, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/terminator
18:46:14.967 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741842_1018 src: /127.0.0.1:54832 dest: /127.0.0.1:38019
18:46:14.968 INFO clienttrace - src: /127.0.0.1:54832, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741842_1018, duration(ns): 453095
18:46:14.968 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741842_1018, type=LAST_IN_PIPELINE terminating
18:46:14.968 INFO FSNamesystem - BLOCK* blk_1073741842_1018 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/terminator
18:46:15.045 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741827_1003 replica FinalizedReplica, blk_1073741827_1003, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741827 for deletion
18:46:15.046 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741833_1009 replica FinalizedReplica, blk_1073741833_1009, FINALIZED
  getNumBytes()     = 13492
  getBytesOnDisk()  = 13492
  getVisibleLength()= 13492
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741833 for deletion
18:46:15.046 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741834_1010 replica FinalizedReplica, blk_1073741834_1010, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741834 for deletion
18:46:15.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741827_1003 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741827
18:46:15.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741834_1010 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741834
18:46:15.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741833_1009 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741833
18:46:15.370 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:15.371 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts dst=null perm=null proto=rpc
18:46:15.372 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:15.373 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:15.374 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam
18:46:15.374 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:15.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.376 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:15.376 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam done
18:46:15.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.377 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai
18:46:15.378 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts dst=null perm=null proto=rpc
18:46:15.379 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:15.380 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:15.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:15.383 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:15.383 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:15.385 INFO StateChange - BLOCK* allocate blk_1073741843_1019, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai
18:46:15.386 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741843_1019 src: /127.0.0.1:54844 dest: /127.0.0.1:38019
18:46:15.388 INFO clienttrace - src: /127.0.0.1:54844, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741843_1019, duration(ns): 549997
18:46:15.388 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741843_1019, type=LAST_IN_PIPELINE terminating
18:46:15.388 INFO FSNamesystem - BLOCK* blk_1073741843_1019 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai
18:46:15.790 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:15.790 INFO IndexFileMerger - Done merging .bai files
18:46:15.791 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.parts dst=null perm=null proto=rpc
18:46:15.800 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:15.801 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.802 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.802 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.803 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.804 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:15.805 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:15.805 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:15.807 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:15.810 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:15.810 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.sbi dst=null perm=null proto=rpc
18:46:15.812 INFO MemoryStore - Block broadcast_62 stored as values in memory (estimated size 297.9 KiB, free 1918.7 MiB)
18:46:15.818 INFO MemoryStore - Block broadcast_62_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
18:46:15.819 INFO BlockManagerInfo - Added broadcast_62_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.8 MiB)
18:46:15.819 INFO SparkContext - Created broadcast 62 from newAPIHadoopFile at PathSplitSource.java:96
18:46:15.841 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.841 INFO FileInputFormat - Total input files to process : 1
18:46:15.842 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.879 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:15.879 INFO DAGScheduler - Got job 29 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:15.879 INFO DAGScheduler - Final stage: ResultStage 41 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:15.880 INFO DAGScheduler - Parents of final stage: List()
18:46:15.880 INFO DAGScheduler - Missing parents: List()
18:46:15.880 INFO DAGScheduler - Submitting ResultStage 41 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:15.898 INFO MemoryStore - Block broadcast_63 stored as values in memory (estimated size 426.2 KiB, free 1918.3 MiB)
18:46:15.899 INFO MemoryStore - Block broadcast_63_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1918.1 MiB)
18:46:15.899 INFO BlockManagerInfo - Added broadcast_63_piece0 in memory on localhost:45727 (size: 153.7 KiB, free: 1919.7 MiB)
18:46:15.900 INFO SparkContext - Created broadcast 63 from broadcast at DAGScheduler.scala:1580
18:46:15.900 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 41 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:15.900 INFO TaskSchedulerImpl - Adding task set 41.0 with 1 tasks resource profile 0
18:46:15.901 INFO TaskSetManager - Starting task 0.0 in stage 41.0 (TID 79) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:15.901 INFO Executor - Running task 0.0 in stage 41.0 (TID 79)
18:46:15.935 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam:0+237038
18:46:15.936 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.937 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.939 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:15.939 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.940 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.941 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:15.941 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:15.942 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:15.944 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:15.947 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:15.948 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:15.948 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.949 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:15.950 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.950 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:15.957 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.958 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.959 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.960 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.961 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.962 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.963 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.963 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.964 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.965 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.968 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.969 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.969 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.972 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.973 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.976 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.977 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.978 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.979 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.980 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.981 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.982 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.982 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.984 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.985 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.986 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.987 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.988 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.989 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.990 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.990 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.991 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.994 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.995 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.996 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.997 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:15.998 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:15.998 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.001 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.002 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.003 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.004 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.004 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.005 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.008 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.009 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.010 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.013 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.014 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.015 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.016 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.017 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.018 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.019 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.020 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.021 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.023 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:16.025 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:16.028 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.028 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:16.032 INFO Executor - Finished task 0.0 in stage 41.0 (TID 79). 651526 bytes result sent to driver
18:46:16.034 INFO TaskSetManager - Finished task 0.0 in stage 41.0 (TID 79) in 133 ms on localhost (executor driver) (1/1)
18:46:16.034 INFO TaskSchedulerImpl - Removed TaskSet 41.0, whose tasks have all completed, from pool
18:46:16.035 INFO DAGScheduler - ResultStage 41 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.154 s
18:46:16.035 INFO DAGScheduler - Job 29 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:16.035 INFO TaskSchedulerImpl - Killing all running tasks in stage 41: Stage finished
18:46:16.035 INFO DAGScheduler - Job 29 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.155927 s
18:46:16.057 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:16.058 INFO DAGScheduler - Got job 30 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:16.058 INFO DAGScheduler - Final stage: ResultStage 42 (count at ReadsSparkSinkUnitTest.java:185)
18:46:16.058 INFO DAGScheduler - Parents of final stage: List()
18:46:16.058 INFO DAGScheduler - Missing parents: List()
18:46:16.058 INFO DAGScheduler - Submitting ResultStage 42 (MapPartitionsRDD[120] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:16.081 INFO MemoryStore - Block broadcast_64 stored as values in memory (estimated size 426.1 KiB, free 1917.7 MiB)
18:46:16.083 INFO MemoryStore - Block broadcast_64_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
18:46:16.083 INFO BlockManagerInfo - Added broadcast_64_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:16.083 INFO SparkContext - Created broadcast 64 from broadcast at DAGScheduler.scala:1580
18:46:16.084 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 42 (MapPartitionsRDD[120] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:16.084 INFO TaskSchedulerImpl - Adding task set 42.0 with 1 tasks resource profile 0
18:46:16.085 INFO TaskSetManager - Starting task 0.0 in stage 42.0 (TID 80) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:16.085 INFO Executor - Running task 0.0 in stage 42.0 (TID 80)
18:46:16.117 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:16.129 INFO Executor - Finished task 0.0 in stage 42.0 (TID 80). 989 bytes result sent to driver
18:46:16.129 INFO TaskSetManager - Finished task 0.0 in stage 42.0 (TID 80) in 44 ms on localhost (executor driver) (1/1)
18:46:16.129 INFO TaskSchedulerImpl - Removed TaskSet 42.0, whose tasks have all completed, from pool
18:46:16.129 INFO DAGScheduler - ResultStage 42 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.071 s
18:46:16.129 INFO DAGScheduler - Job 30 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:16.129 INFO TaskSchedulerImpl - Killing all running tasks in stage 42: Stage finished
18:46:16.129 INFO DAGScheduler - Job 30 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.072234 s
18:46:16.133 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:16.134 INFO DAGScheduler - Got job 31 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:16.134 INFO DAGScheduler - Final stage: ResultStage 43 (count at ReadsSparkSinkUnitTest.java:185)
18:46:16.134 INFO DAGScheduler - Parents of final stage: List()
18:46:16.134 INFO DAGScheduler - Missing parents: List()
18:46:16.134 INFO DAGScheduler - Submitting ResultStage 43 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:16.159 INFO MemoryStore - Block broadcast_65 stored as values in memory (estimated size 426.1 KiB, free 1917.1 MiB)
18:46:16.161 INFO MemoryStore - Block broadcast_65_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.0 MiB)
18:46:16.161 INFO BlockManagerInfo - Added broadcast_65_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.4 MiB)
18:46:16.161 INFO SparkContext - Created broadcast 65 from broadcast at DAGScheduler.scala:1580
18:46:16.161 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 43 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:16.161 INFO TaskSchedulerImpl - Adding task set 43.0 with 1 tasks resource profile 0
18:46:16.162 INFO TaskSetManager - Starting task 0.0 in stage 43.0 (TID 81) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:16.162 INFO Executor - Running task 0.0 in stage 43.0 (TID 81)
18:46:16.195 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam:0+237038
18:46:16.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.199 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:16.200 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.201 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.202 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.203 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.203 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.205 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:16.208 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:16.209 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:16.209 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.210 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.211 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.212 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:16.218 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.219 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.220 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.221 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.222 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.223 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.224 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.225 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.226 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.227 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.229 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.230 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.231 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.232 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.233 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.234 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.235 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.235 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.236 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.237 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.238 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.239 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.240 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.241 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.241 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.243 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.244 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.246 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.247 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.248 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.249 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.250 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.251 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.252 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.253 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.254 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.255 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.256 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.257 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.258 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.258 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.261 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.262 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.263 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.264 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.265 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.266 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.267 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.268 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.269 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.270 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.272 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.272 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.273 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.274 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.275 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.276 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:16.277 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.278 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam dst=null perm=null proto=rpc
18:46:16.280 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.280 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.281 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_f8b25574-e751-40a5-a739-d031d6be90af.bam.bai dst=null perm=null proto=rpc
18:46:16.284 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:16.286 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:16.286 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:16.288 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:16.288 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:16.290 INFO Executor - Finished task 0.0 in stage 43.0 (TID 81). 989 bytes result sent to driver
18:46:16.291 INFO TaskSetManager - Finished task 0.0 in stage 43.0 (TID 81) in 129 ms on localhost (executor driver) (1/1)
18:46:16.291 INFO TaskSchedulerImpl - Removed TaskSet 43.0, whose tasks have all completed, from pool
18:46:16.291 INFO DAGScheduler - ResultStage 43 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.157 s
18:46:16.291 INFO DAGScheduler - Job 31 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:16.291 INFO TaskSchedulerImpl - Killing all running tasks in stage 43: Stage finished
18:46:16.291 INFO DAGScheduler - Job 31 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.157977 s
18:46:16.295 INFO MemoryStore - Block broadcast_66 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
18:46:16.302 INFO MemoryStore - Block broadcast_66_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
18:46:16.302 INFO BlockManagerInfo - Added broadcast_66_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:16.302 INFO SparkContext - Created broadcast 66 from newAPIHadoopFile at PathSplitSource.java:96
18:46:16.325 INFO MemoryStore - Block broadcast_67 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
18:46:16.332 INFO MemoryStore - Block broadcast_67_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.3 MiB)
18:46:16.332 INFO BlockManagerInfo - Added broadcast_67_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:16.332 INFO SparkContext - Created broadcast 67 from newAPIHadoopFile at PathSplitSource.java:96
18:46:16.353 INFO FileInputFormat - Total input files to process : 1
18:46:16.356 INFO MemoryStore - Block broadcast_68 stored as values in memory (estimated size 160.7 KiB, free 1916.1 MiB)
18:46:16.357 INFO MemoryStore - Block broadcast_68_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.1 MiB)
18:46:16.357 INFO BlockManagerInfo - Added broadcast_68_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:16.358 INFO SparkContext - Created broadcast 68 from broadcast at ReadsSparkSink.java:133
18:46:16.358 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:16.359 INFO MemoryStore - Block broadcast_69 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
18:46:16.360 INFO MemoryStore - Block broadcast_69_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
18:46:16.360 INFO BlockManagerInfo - Added broadcast_69_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:16.361 INFO SparkContext - Created broadcast 69 from broadcast at BamSink.java:76
18:46:16.363 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts dst=null perm=null proto=rpc
18:46:16.364 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:16.364 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:16.364 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:16.365 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:16.371 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:16.372 INFO DAGScheduler - Registering RDD 153 (mapToPair at SparkUtils.java:161) as input to shuffle 10
18:46:16.372 INFO DAGScheduler - Got job 32 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:16.372 INFO DAGScheduler - Final stage: ResultStage 45 (runJob at SparkHadoopWriter.scala:83)
18:46:16.372 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 44)
18:46:16.372 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 44)
18:46:16.372 INFO DAGScheduler - Submitting ShuffleMapStage 44 (MapPartitionsRDD[153] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:16.391 INFO MemoryStore - Block broadcast_70 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
18:46:16.392 INFO MemoryStore - Block broadcast_70_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.3 MiB)
18:46:16.393 INFO BlockManagerInfo - Added broadcast_70_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.1 MiB)
18:46:16.393 INFO SparkContext - Created broadcast 70 from broadcast at DAGScheduler.scala:1580
18:46:16.393 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 44 (MapPartitionsRDD[153] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:16.393 INFO TaskSchedulerImpl - Adding task set 44.0 with 1 tasks resource profile 0
18:46:16.394 INFO TaskSetManager - Starting task 0.0 in stage 44.0 (TID 82) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:16.395 INFO Executor - Running task 0.0 in stage 44.0 (TID 82)
18:46:16.429 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:16.443 INFO BlockManagerInfo - Removed broadcast_67_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:16.444 INFO BlockManagerInfo - Removed broadcast_61_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.2 MiB)
18:46:16.445 INFO BlockManagerInfo - Removed broadcast_65_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.4 MiB)
18:46:16.447 INFO BlockManagerInfo - Removed broadcast_64_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.5 MiB)
18:46:16.448 INFO BlockManagerInfo - Removed broadcast_56_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:16.451 INFO BlockManagerInfo - Removed broadcast_62_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:16.451 INFO BlockManagerInfo - Removed broadcast_63_piece0 on localhost:45727 in memory (size: 153.7 KiB, free: 1919.8 MiB)
18:46:16.452 INFO BlockManagerInfo - Removed broadcast_58_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:16.453 INFO BlockManagerInfo - Removed broadcast_59_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:16.470 INFO Executor - Finished task 0.0 in stage 44.0 (TID 82). 1191 bytes result sent to driver
18:46:16.471 INFO TaskSetManager - Finished task 0.0 in stage 44.0 (TID 82) in 76 ms on localhost (executor driver) (1/1)
18:46:16.471 INFO TaskSchedulerImpl - Removed TaskSet 44.0, whose tasks have all completed, from pool
18:46:16.471 INFO DAGScheduler - ShuffleMapStage 44 (mapToPair at SparkUtils.java:161) finished in 0.098 s
18:46:16.471 INFO DAGScheduler - looking for newly runnable stages
18:46:16.471 INFO DAGScheduler - running: HashSet()
18:46:16.471 INFO DAGScheduler - waiting: HashSet(ResultStage 45)
18:46:16.471 INFO DAGScheduler - failed: HashSet()
18:46:16.472 INFO DAGScheduler - Submitting ResultStage 45 (MapPartitionsRDD[158] at mapToPair at BamSink.java:91), which has no missing parents
18:46:16.484 INFO MemoryStore - Block broadcast_71 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
18:46:16.485 INFO MemoryStore - Block broadcast_71_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
18:46:16.486 INFO BlockManagerInfo - Added broadcast_71_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.7 MiB)
18:46:16.486 INFO SparkContext - Created broadcast 71 from broadcast at DAGScheduler.scala:1580
18:46:16.486 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 45 (MapPartitionsRDD[158] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:16.486 INFO TaskSchedulerImpl - Adding task set 45.0 with 1 tasks resource profile 0
18:46:16.487 INFO TaskSetManager - Starting task 0.0 in stage 45.0 (TID 83) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:16.488 INFO Executor - Running task 0.0 in stage 45.0 (TID 83)
18:46:16.496 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:16.496 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:16.521 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:16.521 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:16.521 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:16.521 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:16.521 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:16.521 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:16.523 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:16.524 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:16.527 INFO StateChange - BLOCK* allocate blk_1073741844_1020, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0/part-r-00000
18:46:16.529 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741844_1020 src: /127.0.0.1:55490 dest: /127.0.0.1:38019
18:46:16.532 INFO clienttrace - src: /127.0.0.1:55490, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741844_1020, duration(ns): 2566685
18:46:16.532 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741844_1020, type=LAST_IN_PIPELINE terminating
18:46:16.533 INFO FSNamesystem - BLOCK* blk_1073741844_1020 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0/part-r-00000
18:46:16.934 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:16.935 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:16.936 INFO StateChange - BLOCK* allocate blk_1073741845_1021, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0/.part-r-00000.sbi
18:46:16.937 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741845_1021 src: /127.0.0.1:55492 dest: /127.0.0.1:38019
18:46:16.939 INFO clienttrace - src: /127.0.0.1:55492, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741845_1021, duration(ns): 474145
18:46:16.939 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741845_1021, type=LAST_IN_PIPELINE terminating
18:46:16.940 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:16.940 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0 dst=null perm=null proto=rpc
18:46:16.941 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0 dst=null perm=null proto=rpc
18:46:16.942 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/task_20250519184616735180775002307797_0158_r_000000 dst=null perm=null proto=rpc
18:46:16.943 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/_temporary/attempt_20250519184616735180775002307797_0158_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/task_20250519184616735180775002307797_0158_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:16.943 INFO FileOutputCommitter - Saved output of task 'attempt_20250519184616735180775002307797_0158_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/task_20250519184616735180775002307797_0158_r_000000
18:46:16.943 INFO SparkHadoopMapRedUtil - attempt_20250519184616735180775002307797_0158_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:16.944 INFO Executor - Finished task 0.0 in stage 45.0 (TID 83). 1858 bytes result sent to driver
18:46:16.944 INFO TaskSetManager - Finished task 0.0 in stage 45.0 (TID 83) in 457 ms on localhost (executor driver) (1/1)
18:46:16.944 INFO TaskSchedulerImpl - Removed TaskSet 45.0, whose tasks have all completed, from pool
18:46:16.945 INFO DAGScheduler - ResultStage 45 (runJob at SparkHadoopWriter.scala:83) finished in 0.473 s
18:46:16.945 INFO DAGScheduler - Job 32 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:16.945 INFO TaskSchedulerImpl - Killing all running tasks in stage 45: Stage finished
18:46:16.945 INFO DAGScheduler - Job 32 finished: runJob at SparkHadoopWriter.scala:83, took 0.573762 s
18:46:16.946 INFO SparkHadoopWriter - Start to commit write Job job_20250519184616735180775002307797_0158.
18:46:16.946 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:16.947 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts dst=null perm=null proto=rpc
18:46:16.948 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/task_20250519184616735180775002307797_0158_r_000000 dst=null perm=null proto=rpc
18:46:16.948 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:16.949 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/task_20250519184616735180775002307797_0158_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:16.950 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:16.951 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary/0/task_20250519184616735180775002307797_0158_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:16.951 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:16.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:16.953 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:16.954 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/.spark-staging-158 dst=null perm=null proto=rpc
18:46:16.954 INFO SparkHadoopWriter - Write Job job_20250519184616735180775002307797_0158 committed. Elapsed time: 8 ms.
18:46:16.955 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:16.956 INFO StateChange - BLOCK* allocate blk_1073741846_1022, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/header
18:46:16.957 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741846_1022 src: /127.0.0.1:55500 dest: /127.0.0.1:38019
18:46:16.959 INFO clienttrace - src: /127.0.0.1:55500, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741846_1022, duration(ns): 495445
18:46:16.959 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741846_1022, type=LAST_IN_PIPELINE terminating
18:46:16.959 INFO FSNamesystem - BLOCK* blk_1073741846_1022 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/header
18:46:17.360 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:17.362 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:17.363 INFO StateChange - BLOCK* allocate blk_1073741847_1023, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/terminator
18:46:17.364 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741847_1023 src: /127.0.0.1:55510 dest: /127.0.0.1:38019
18:46:17.365 INFO clienttrace - src: /127.0.0.1:55510, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741847_1023, duration(ns): 461133
18:46:17.365 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741847_1023, type=LAST_IN_PIPELINE terminating
18:46:17.366 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:17.367 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts dst=null perm=null proto=rpc
18:46:17.368 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:17.369 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:17.369 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam
18:46:17.370 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:17.370 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.371 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:17.371 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam done
18:46:17.372 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.372 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi
18:46:17.372 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts dst=null perm=null proto=rpc
18:46:17.373 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:17.374 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:17.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:17.376 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:17.376 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:17.377 INFO StateChange - BLOCK* allocate blk_1073741848_1024, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi
18:46:17.378 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741848_1024 src: /127.0.0.1:55512 dest: /127.0.0.1:38019
18:46:17.380 INFO clienttrace - src: /127.0.0.1:55512, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741848_1024, duration(ns): 459899
18:46:17.380 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741848_1024, type=LAST_IN_PIPELINE terminating
18:46:17.381 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:17.381 INFO IndexFileMerger - Done merging .sbi files
18:46:17.382 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.parts dst=null perm=null proto=rpc
18:46:17.392 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi dst=null perm=null proto=rpc
18:46:17.392 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi dst=null perm=null proto=rpc
18:46:17.393 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi dst=null perm=null proto=rpc
18:46:17.394 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:17.394 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.395 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.395 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.396 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.397 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.bai dst=null perm=null proto=rpc
18:46:17.397 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bai dst=null perm=null proto=rpc
18:46:17.398 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:17.400 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi dst=null perm=null proto=rpc
18:46:17.400 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi dst=null perm=null proto=rpc
18:46:17.401 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.sbi dst=null perm=null proto=rpc
18:46:17.402 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:17.402 INFO MemoryStore - Block broadcast_72 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
18:46:17.403 INFO MemoryStore - Block broadcast_72_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
18:46:17.403 INFO BlockManagerInfo - Added broadcast_72_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.7 MiB)
18:46:17.404 INFO SparkContext - Created broadcast 72 from broadcast at BamSource.java:104
18:46:17.405 INFO MemoryStore - Block broadcast_73 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:17.416 INFO MemoryStore - Block broadcast_73_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:17.416 INFO BlockManagerInfo - Added broadcast_73_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:17.416 INFO SparkContext - Created broadcast 73 from newAPIHadoopFile at PathSplitSource.java:96
18:46:17.432 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.432 INFO FileInputFormat - Total input files to process : 1
18:46:17.433 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.454 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:17.454 INFO DAGScheduler - Got job 33 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:17.454 INFO DAGScheduler - Final stage: ResultStage 46 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:17.454 INFO DAGScheduler - Parents of final stage: List()
18:46:17.454 INFO DAGScheduler - Missing parents: List()
18:46:17.454 INFO DAGScheduler - Submitting ResultStage 46 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:17.461 INFO MemoryStore - Block broadcast_74 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
18:46:17.462 INFO MemoryStore - Block broadcast_74_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
18:46:17.462 INFO BlockManagerInfo - Added broadcast_74_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:17.462 INFO SparkContext - Created broadcast 74 from broadcast at DAGScheduler.scala:1580
18:46:17.462 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 46 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:17.462 INFO TaskSchedulerImpl - Adding task set 46.0 with 1 tasks resource profile 0
18:46:17.463 INFO TaskSetManager - Starting task 0.0 in stage 46.0 (TID 84) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:17.464 INFO Executor - Running task 0.0 in stage 46.0 (TID 84)
18:46:17.476 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam:0+237038
18:46:17.477 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.478 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.479 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.bai dst=null perm=null proto=rpc
18:46:17.479 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bai dst=null perm=null proto=rpc
18:46:17.481 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:17.483 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:17.486 INFO Executor - Finished task 0.0 in stage 46.0 (TID 84). 651526 bytes result sent to driver
18:46:17.488 INFO TaskSetManager - Finished task 0.0 in stage 46.0 (TID 84) in 25 ms on localhost (executor driver) (1/1)
18:46:17.488 INFO TaskSchedulerImpl - Removed TaskSet 46.0, whose tasks have all completed, from pool
18:46:17.488 INFO DAGScheduler - ResultStage 46 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.033 s
18:46:17.489 INFO DAGScheduler - Job 33 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:17.489 INFO TaskSchedulerImpl - Killing all running tasks in stage 46: Stage finished
18:46:17.489 INFO DAGScheduler - Job 33 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.035041 s
18:46:17.499 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:17.500 INFO DAGScheduler - Got job 34 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:17.500 INFO DAGScheduler - Final stage: ResultStage 47 (count at ReadsSparkSinkUnitTest.java:185)
18:46:17.500 INFO DAGScheduler - Parents of final stage: List()
18:46:17.500 INFO DAGScheduler - Missing parents: List()
18:46:17.500 INFO DAGScheduler - Submitting ResultStage 47 (MapPartitionsRDD[146] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:17.518 INFO MemoryStore - Block broadcast_75 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
18:46:17.519 INFO MemoryStore - Block broadcast_75_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
18:46:17.519 INFO BlockManagerInfo - Added broadcast_75_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:17.520 INFO SparkContext - Created broadcast 75 from broadcast at DAGScheduler.scala:1580
18:46:17.520 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 47 (MapPartitionsRDD[146] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:17.520 INFO TaskSchedulerImpl - Adding task set 47.0 with 1 tasks resource profile 0
18:46:17.521 INFO TaskSetManager - Starting task 0.0 in stage 47.0 (TID 85) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:17.521 INFO Executor - Running task 0.0 in stage 47.0 (TID 85)
18:46:17.554 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:17.566 INFO Executor - Finished task 0.0 in stage 47.0 (TID 85). 989 bytes result sent to driver
18:46:17.567 INFO TaskSetManager - Finished task 0.0 in stage 47.0 (TID 85) in 46 ms on localhost (executor driver) (1/1)
18:46:17.567 INFO TaskSchedulerImpl - Removed TaskSet 47.0, whose tasks have all completed, from pool
18:46:17.567 INFO DAGScheduler - ResultStage 47 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.066 s
18:46:17.567 INFO DAGScheduler - Job 34 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:17.567 INFO TaskSchedulerImpl - Killing all running tasks in stage 47: Stage finished
18:46:17.567 INFO DAGScheduler - Job 34 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.067599 s
18:46:17.571 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:17.572 INFO DAGScheduler - Got job 35 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:17.572 INFO DAGScheduler - Final stage: ResultStage 48 (count at ReadsSparkSinkUnitTest.java:185)
18:46:17.572 INFO DAGScheduler - Parents of final stage: List()
18:46:17.572 INFO DAGScheduler - Missing parents: List()
18:46:17.572 INFO DAGScheduler - Submitting ResultStage 48 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:17.579 INFO MemoryStore - Block broadcast_76 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
18:46:17.580 INFO MemoryStore - Block broadcast_76_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
18:46:17.580 INFO BlockManagerInfo - Added broadcast_76_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.4 MiB)
18:46:17.580 INFO SparkContext - Created broadcast 76 from broadcast at DAGScheduler.scala:1580
18:46:17.580 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 48 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:17.580 INFO TaskSchedulerImpl - Adding task set 48.0 with 1 tasks resource profile 0
18:46:17.581 INFO TaskSetManager - Starting task 0.0 in stage 48.0 (TID 86) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:17.582 INFO Executor - Running task 0.0 in stage 48.0 (TID 86)
18:46:17.599 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam:0+237038
18:46:17.600 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.601 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam dst=null perm=null proto=rpc
18:46:17.602 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bam.bai dst=null perm=null proto=rpc
18:46:17.602 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_ceb6a4b9-a6e0-44d2-a1a1-74e165aae7d4.bai dst=null perm=null proto=rpc
18:46:17.604 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:17.607 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:17.608 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:17.610 INFO Executor - Finished task 0.0 in stage 48.0 (TID 86). 989 bytes result sent to driver
18:46:17.610 INFO TaskSetManager - Finished task 0.0 in stage 48.0 (TID 86) in 29 ms on localhost (executor driver) (1/1)
18:46:17.610 INFO TaskSchedulerImpl - Removed TaskSet 48.0, whose tasks have all completed, from pool
18:46:17.610 INFO DAGScheduler - ResultStage 48 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.038 s
18:46:17.611 INFO DAGScheduler - Job 35 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:17.611 INFO TaskSchedulerImpl - Killing all running tasks in stage 48: Stage finished
18:46:17.611 INFO DAGScheduler - Job 35 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.039528 s
18:46:17.615 INFO MemoryStore - Block broadcast_77 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
18:46:17.621 INFO MemoryStore - Block broadcast_77_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
18:46:17.621 INFO BlockManagerInfo - Added broadcast_77_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:17.622 INFO SparkContext - Created broadcast 77 from newAPIHadoopFile at PathSplitSource.java:96
18:46:17.644 INFO MemoryStore - Block broadcast_78 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
18:46:17.651 INFO MemoryStore - Block broadcast_78_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:17.651 INFO BlockManagerInfo - Added broadcast_78_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:17.651 INFO SparkContext - Created broadcast 78 from newAPIHadoopFile at PathSplitSource.java:96
18:46:17.672 INFO FileInputFormat - Total input files to process : 1
18:46:17.674 INFO MemoryStore - Block broadcast_79 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
18:46:17.675 INFO MemoryStore - Block broadcast_79_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
18:46:17.675 INFO BlockManagerInfo - Added broadcast_79_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:17.676 INFO SparkContext - Created broadcast 79 from broadcast at ReadsSparkSink.java:133
18:46:17.676 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:17.676 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:17.677 INFO MemoryStore - Block broadcast_80 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
18:46:17.678 INFO MemoryStore - Block broadcast_80_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
18:46:17.678 INFO BlockManagerInfo - Added broadcast_80_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:17.679 INFO SparkContext - Created broadcast 80 from broadcast at BamSink.java:76
18:46:17.681 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts dst=null perm=null proto=rpc
18:46:17.681 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:17.681 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:17.681 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:17.682 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:17.693 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:17.693 INFO DAGScheduler - Registering RDD 178 (mapToPair at SparkUtils.java:161) as input to shuffle 11
18:46:17.694 INFO DAGScheduler - Got job 36 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:17.694 INFO DAGScheduler - Final stage: ResultStage 50 (runJob at SparkHadoopWriter.scala:83)
18:46:17.694 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 49)
18:46:17.694 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 49)
18:46:17.694 INFO DAGScheduler - Submitting ShuffleMapStage 49 (MapPartitionsRDD[178] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:17.712 INFO MemoryStore - Block broadcast_81 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
18:46:17.714 INFO MemoryStore - Block broadcast_81_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
18:46:17.714 INFO BlockManagerInfo - Added broadcast_81_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.1 MiB)
18:46:17.714 INFO SparkContext - Created broadcast 81 from broadcast at DAGScheduler.scala:1580
18:46:17.714 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 49 (MapPartitionsRDD[178] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:17.715 INFO TaskSchedulerImpl - Adding task set 49.0 with 1 tasks resource profile 0
18:46:17.715 INFO TaskSetManager - Starting task 0.0 in stage 49.0 (TID 87) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:17.716 INFO Executor - Running task 0.0 in stage 49.0 (TID 87)
18:46:17.747 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:17.765 INFO Executor - Finished task 0.0 in stage 49.0 (TID 87). 1148 bytes result sent to driver
18:46:17.766 INFO TaskSetManager - Finished task 0.0 in stage 49.0 (TID 87) in 51 ms on localhost (executor driver) (1/1)
18:46:17.766 INFO TaskSchedulerImpl - Removed TaskSet 49.0, whose tasks have all completed, from pool
18:46:17.766 INFO DAGScheduler - ShuffleMapStage 49 (mapToPair at SparkUtils.java:161) finished in 0.072 s
18:46:17.766 INFO DAGScheduler - looking for newly runnable stages
18:46:17.766 INFO DAGScheduler - running: HashSet()
18:46:17.766 INFO DAGScheduler - waiting: HashSet(ResultStage 50)
18:46:17.766 INFO DAGScheduler - failed: HashSet()
18:46:17.766 INFO DAGScheduler - Submitting ResultStage 50 (MapPartitionsRDD[183] at mapToPair at BamSink.java:91), which has no missing parents
18:46:17.778 INFO MemoryStore - Block broadcast_82 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
18:46:17.779 INFO MemoryStore - Block broadcast_82_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.1 MiB)
18:46:17.779 INFO BlockManagerInfo - Added broadcast_82_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.1 MiB)
18:46:17.780 INFO SparkContext - Created broadcast 82 from broadcast at DAGScheduler.scala:1580
18:46:17.780 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 50 (MapPartitionsRDD[183] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:17.780 INFO TaskSchedulerImpl - Adding task set 50.0 with 1 tasks resource profile 0
18:46:17.781 INFO TaskSetManager - Starting task 0.0 in stage 50.0 (TID 88) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:17.781 INFO Executor - Running task 0.0 in stage 50.0 (TID 88)
18:46:17.786 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:17.786 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:17.801 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:17.801 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:17.801 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:17.801 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:17.801 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:17.801 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:17.803 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/_temporary/attempt_20250519184617548622401951361242_0183_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:17.807 INFO StateChange - BLOCK* allocate blk_1073741849_1025, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/_temporary/attempt_20250519184617548622401951361242_0183_r_000000_0/part-r-00000
18:46:17.808 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741849_1025 src: /127.0.0.1:55528 dest: /127.0.0.1:38019
18:46:17.811 INFO clienttrace - src: /127.0.0.1:55528, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741849_1025, duration(ns): 2338255
18:46:17.812 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741849_1025, type=LAST_IN_PIPELINE terminating
18:46:17.813 INFO FSNamesystem - BLOCK* blk_1073741849_1025 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/_temporary/attempt_20250519184617548622401951361242_0183_r_000000_0/part-r-00000
18:46:18.046 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741840_1016 replica FinalizedReplica, blk_1073741840_1016, FINALIZED, getNumBytes()=5472, getBytesOnDisk()=5472, getVisibleLength()=5472, getVolume()=/tmp/minicluster_storage13238592372457082651/data/data2, getBlockURI()=file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741840 for deletion
18:46:18.046 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741845_1021 replica FinalizedReplica, blk_1073741845_1021, FINALIZED, getNumBytes()=212, getBytesOnDisk()=212, getVisibleLength()=212, getVolume()=/tmp/minicluster_storage13238592372457082651/data/data1, getBlockURI()=file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741845 for deletion
18:46:18.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741840_1016 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741840
18:46:18.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741845_1021 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741845
18:46:18.214 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/_temporary/attempt_20250519184617548622401951361242_0183_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:18.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/_temporary/attempt_20250519184617548622401951361242_0183_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:18.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/_temporary/attempt_20250519184617548622401951361242_0183_r_000000_0 dst=null perm=null proto=rpc
18:46:18.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/_temporary/attempt_20250519184617548622401951361242_0183_r_000000_0 dst=null perm=null proto=rpc
18:46:18.217 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/task_20250519184617548622401951361242_0183_r_000000 dst=null perm=null proto=rpc
18:46:18.218 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/_temporary/attempt_20250519184617548622401951361242_0183_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/task_20250519184617548622401951361242_0183_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:18.218 INFO FileOutputCommitter - Saved output of task 'attempt_20250519184617548622401951361242_0183_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/task_20250519184617548622401951361242_0183_r_000000
18:46:18.218 INFO SparkHadoopMapRedUtil - attempt_20250519184617548622401951361242_0183_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:18.219 INFO Executor - Finished task 0.0 in stage 50.0 (TID 88). 1858 bytes result sent to driver
18:46:18.219 INFO TaskSetManager - Finished task 0.0 in stage 50.0 (TID 88) in 438 ms on localhost (executor driver) (1/1)
18:46:18.220 INFO TaskSchedulerImpl - Removed TaskSet 50.0, whose tasks have all completed, from pool
18:46:18.220 INFO DAGScheduler - ResultStage 50 (runJob at SparkHadoopWriter.scala:83) finished in 0.453 s
18:46:18.220 INFO DAGScheduler - Job 36 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:18.220 INFO TaskSchedulerImpl - Killing all running tasks in stage 50: Stage finished
18:46:18.220 INFO DAGScheduler - Job 36 finished: runJob at SparkHadoopWriter.scala:83, took 0.527325 s
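[editor's note] The audit entries above (create under _temporary/.../attempt_*, rename to task_*, rename of part-r-00000 into the output directory, delete of _temporary, create of _SUCCESS) are the standard FileOutputCommitter "algorithm version 1" sequence the log reports at 18:46:17.801. The following is a minimal, self-contained Java sketch of that rename dance against the Hadoop FileSystem API, under hypothetical placeholder paths and the local default filesystem rather than the test's MiniDFSCluster; it only illustrates the sequence visible in the log and is not Spark's actual committer code.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CommitV1Sketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf); // local FS by default; the test writes to HDFS instead

            Path out     = new Path("/tmp/commit-v1-sketch/example.bam.parts"); // hypothetical output dir
            Path attempt = new Path(out, "_temporary/0/_temporary/attempt_000_r_000000_0");
            Path taskDir = new Path(out, "_temporary/0/task_000_r_000000");

            // Stand-in for the task writing its part file under the attempt directory.
            fs.mkdirs(attempt);
            fs.create(new Path(attempt, "part-r-00000")).close();

            // Task commit: the attempt directory is renamed to the task directory.
            fs.rename(attempt, taskDir);

            // Job commit: the task's part file is renamed into the output directory,
            // _temporary is deleted and an empty _SUCCESS marker is created.
            fs.rename(new Path(taskDir, "part-r-00000"), new Path(out, "part-r-00000"));
            fs.delete(new Path(out, "_temporary"), true);
            fs.create(new Path(out, "_SUCCESS")).close();
        }
    }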
18:46:18.221 INFO SparkHadoopWriter - Start to commit write Job job_20250519184617548622401951361242_0183.
18:46:18.221 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:18.222 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts dst=null perm=null proto=rpc
18:46:18.223 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/task_20250519184617548622401951361242_0183_r_000000 dst=null perm=null proto=rpc
18:46:18.223 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:18.224 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary/0/task_20250519184617548622401951361242_0183_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.224 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:18.225 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.226 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:18.227 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/.spark-staging-183 dst=null perm=null proto=rpc
18:46:18.227 INFO SparkHadoopWriter - Write Job job_20250519184617548622401951361242_0183 committed. Elapsed time: 6 ms.
18:46:18.228 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.229 INFO StateChange - BLOCK* allocate blk_1073741850_1026, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/header
18:46:18.230 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741850_1026 src: /127.0.0.1:57374 dest: /127.0.0.1:38019
18:46:18.231 INFO clienttrace - src: /127.0.0.1:57374, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741850_1026, duration(ns): 466300
18:46:18.231 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741850_1026, type=LAST_IN_PIPELINE terminating
18:46:18.232 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:18.233 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.234 INFO StateChange - BLOCK* allocate blk_1073741851_1027, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/terminator
18:46:18.235 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741851_1027 src: /127.0.0.1:57386 dest: /127.0.0.1:38019
18:46:18.236 INFO clienttrace - src: /127.0.0.1:57386, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741851_1027, duration(ns): 369157
18:46:18.236 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741851_1027, type=LAST_IN_PIPELINE terminating
18:46:18.237 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:18.238 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts dst=null perm=null proto=rpc
18:46:18.239 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.239 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:18.240 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam
18:46:18.240 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.241 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.242 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.242 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam done
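[editor's note] At this point the .parts directory holds three pieces: the serialized BAM header (5,712 bytes above), the record data from the single reduce task, and the 28-byte BGZF terminator. The audit log shows HadoopFileSystemWrapper stitching them together with an HDFS concat into the empty "output" file and renaming the result to the requested .bam path. Below is a rough Java sketch of that assembly step using placeholder paths and a placeholder namenode URI; FileSystem.concat() is only implemented by some filesystems (HDFS among them) and imposes block-size and replication restrictions on its arguments, so this is illustrative rather than a drop-in utility.

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConcatPartsSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder namenode URI; the test talks to the MiniDFSCluster's ephemeral port.
            FileSystem fs = FileSystem.get(new URI("hdfs://localhost:8020"), conf);

            Path parts  = new Path("/user/runner/example.bam.parts"); // hypothetical working dir
            Path target = new Path(parts, "output");                  // concat target, created empty first

            fs.create(target).close();
            fs.concat(target, new Path[] {
                    new Path(parts, "header"),       // serialized BAM header
                    new Path(parts, "part-r-00000"), // records written by the Spark task
                    new Path(parts, "terminator")    // 28-byte BGZF end-of-file block
            });

            // Promote the assembled file to the final name and drop the working directory.
            fs.rename(target, new Path("/user/runner/example.bam"));
            fs.delete(parts, true);
        }
    }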
18:46:18.242 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.243 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.parts dst=null perm=null proto=rpc
18:46:18.244 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.244 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.244 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.245 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.246 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.bai dst=null perm=null proto=rpc
18:46:18.246 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bai dst=null perm=null proto=rpc
18:46:18.248 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:18.249 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.250 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.sbi dst=null perm=null proto=rpc
18:46:18.251 INFO MemoryStore - Block broadcast_83 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
18:46:18.258 INFO MemoryStore - Block broadcast_83_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1914.7 MiB)
18:46:18.258 INFO BlockManagerInfo - Added broadcast_83_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.0 MiB)
18:46:18.258 INFO SparkContext - Created broadcast 83 from newAPIHadoopFile at PathSplitSource.java:96
18:46:18.280 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.280 INFO FileInputFormat - Total input files to process : 1
18:46:18.281 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.318 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:18.318 INFO DAGScheduler - Got job 37 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:18.318 INFO DAGScheduler - Final stage: ResultStage 51 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:18.318 INFO DAGScheduler - Parents of final stage: List()
18:46:18.318 INFO DAGScheduler - Missing parents: List()
18:46:18.319 INFO DAGScheduler - Submitting ResultStage 51 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:18.343 INFO MemoryStore - Block broadcast_84 stored as values in memory (estimated size 426.2 KiB, free 1914.3 MiB)
18:46:18.345 INFO MemoryStore - Block broadcast_84_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1914.2 MiB)
18:46:18.345 INFO BlockManagerInfo - Added broadcast_84_piece0 in memory on localhost:45727 (size: 153.7 KiB, free: 1918.9 MiB)
18:46:18.345 INFO SparkContext - Created broadcast 84 from broadcast at DAGScheduler.scala:1580
18:46:18.346 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 51 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:18.346 INFO TaskSchedulerImpl - Adding task set 51.0 with 1 tasks resource profile 0
18:46:18.346 INFO TaskSetManager - Starting task 0.0 in stage 51.0 (TID 89) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:18.347 INFO Executor - Running task 0.0 in stage 51.0 (TID 89)
18:46:18.379 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam:0+237038
18:46:18.380 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.383 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:18.383 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.384 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.385 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.bai dst=null perm=null proto=rpc
18:46:18.385 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bai dst=null perm=null proto=rpc
18:46:18.387 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:18.398 INFO BlockManagerInfo - Removed broadcast_82_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1918.9 MiB)
18:46:18.399 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.399 INFO BlockManagerInfo - Removed broadcast_81_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.1 MiB)
18:46:18.400 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.400 INFO BlockManagerInfo - Removed broadcast_70_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.2 MiB)
18:46:18.401 INFO BlockManagerInfo - Removed broadcast_75_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.4 MiB)
18:46:18.402 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.403 INFO BlockManagerInfo - Removed broadcast_74_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.4 MiB)
18:46:18.403 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:18.403 INFO BlockManagerInfo - Removed broadcast_80_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:18.404 INFO BlockManagerInfo - Removed broadcast_79_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:18.405 INFO BlockManagerInfo - Removed broadcast_73_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:18.406 INFO BlockManagerInfo - Removed broadcast_78_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:18.407 INFO BlockManagerInfo - Removed broadcast_76_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.6 MiB)
18:46:18.408 INFO BlockManagerInfo - Removed broadcast_69_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:18.408 INFO BlockManagerInfo - Removed broadcast_68_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:18.409 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.410 INFO BlockManagerInfo - Removed broadcast_71_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.7 MiB)
18:46:18.410 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.411 INFO BlockManagerInfo - Removed broadcast_66_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:18.411 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.412 INFO BlockManagerInfo - Removed broadcast_72_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.8 MiB)
18:46:18.412 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.413 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.414 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.415 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.415 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.417 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.418 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.419 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.420 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.421 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.421 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.423 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.424 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.425 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.427 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.428 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.428 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.429 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.430 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.431 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.432 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.432 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.433 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.434 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.435 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.436 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.438 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.439 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.440 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.440 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.442 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.443 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.444 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.445 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.446 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.447 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.447 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.448 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.449 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.450 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.451 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.451 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.452 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.454 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.455 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.455 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.456 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.457 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.458 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.458 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.460 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.461 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.462 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.463 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.463 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.464 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.465 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.bai dst=null perm=null proto=rpc
18:46:18.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bai dst=null perm=null proto=rpc
18:46:18.467 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:18.470 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.471 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:18.475 INFO Executor - Finished task 0.0 in stage 51.0 (TID 89). 651569 bytes result sent to driver
18:46:18.478 INFO TaskSetManager - Finished task 0.0 in stage 51.0 (TID 89) in 132 ms on localhost (executor driver) (1/1)
18:46:18.478 INFO TaskSchedulerImpl - Removed TaskSet 51.0, whose tasks have all completed, from pool
18:46:18.478 INFO DAGScheduler - ResultStage 51 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.159 s
18:46:18.478 INFO DAGScheduler - Job 37 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:18.478 INFO TaskSchedulerImpl - Killing all running tasks in stage 51: Stage finished
18:46:18.478 INFO DAGScheduler - Job 37 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.160486 s
18:46:18.489 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:18.489 INFO DAGScheduler - Got job 38 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:18.489 INFO DAGScheduler - Final stage: ResultStage 52 (count at ReadsSparkSinkUnitTest.java:185)
18:46:18.489 INFO DAGScheduler - Parents of final stage: List()
18:46:18.489 INFO DAGScheduler - Missing parents: List()
18:46:18.490 INFO DAGScheduler - Submitting ResultStage 52 (MapPartitionsRDD[171] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:18.510 INFO MemoryStore - Block broadcast_85 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
18:46:18.511 INFO MemoryStore - Block broadcast_85_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
18:46:18.511 INFO BlockManagerInfo - Added broadcast_85_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.6 MiB)
18:46:18.512 INFO SparkContext - Created broadcast 85 from broadcast at DAGScheduler.scala:1580
18:46:18.512 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 52 (MapPartitionsRDD[171] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:18.512 INFO TaskSchedulerImpl - Adding task set 52.0 with 1 tasks resource profile 0
18:46:18.513 INFO TaskSetManager - Starting task 0.0 in stage 52.0 (TID 90) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:18.513 INFO Executor - Running task 0.0 in stage 52.0 (TID 90)
18:46:18.545 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:18.556 INFO Executor - Finished task 0.0 in stage 52.0 (TID 90). 989 bytes result sent to driver
18:46:18.557 INFO TaskSetManager - Finished task 0.0 in stage 52.0 (TID 90) in 44 ms on localhost (executor driver) (1/1)
18:46:18.557 INFO TaskSchedulerImpl - Removed TaskSet 52.0, whose tasks have all completed, from pool
18:46:18.557 INFO DAGScheduler - ResultStage 52 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.067 s
18:46:18.557 INFO DAGScheduler - Job 38 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:18.557 INFO TaskSchedulerImpl - Killing all running tasks in stage 52: Stage finished
18:46:18.557 INFO DAGScheduler - Job 38 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.068475 s
18:46:18.561 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:18.561 INFO DAGScheduler - Got job 39 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:18.561 INFO DAGScheduler - Final stage: ResultStage 53 (count at ReadsSparkSinkUnitTest.java:185)
18:46:18.561 INFO DAGScheduler - Parents of final stage: List()
18:46:18.561 INFO DAGScheduler - Missing parents: List()
18:46:18.562 INFO DAGScheduler - Submitting ResultStage 53 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:18.579 INFO MemoryStore - Block broadcast_86 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
18:46:18.580 INFO MemoryStore - Block broadcast_86_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
18:46:18.581 INFO BlockManagerInfo - Added broadcast_86_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:18.581 INFO SparkContext - Created broadcast 86 from broadcast at DAGScheduler.scala:1580
18:46:18.581 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 53 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:18.581 INFO TaskSchedulerImpl - Adding task set 53.0 with 1 tasks resource profile 0
18:46:18.582 INFO TaskSetManager - Starting task 0.0 in stage 53.0 (TID 91) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:18.582 INFO Executor - Running task 0.0 in stage 53.0 (TID 91)
18:46:18.612 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam:0+237038
18:46:18.613 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.614 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.615 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.616 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.617 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.bai dst=null perm=null proto=rpc
18:46:18.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bai dst=null perm=null proto=rpc
18:46:18.619 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:18.621 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.621 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.622 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.623 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:18.627 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.628 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.629 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.629 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.631 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.632 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.633 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.635 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.636 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.637 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.638 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.639 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.640 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.641 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.642 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.644 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.645 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.647 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.648 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.648 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.649 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.650 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.651 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.652 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.653 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.654 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.655 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.656 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.656 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.657 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.658 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.659 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.660 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.661 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.661 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.663 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.664 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.664 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.666 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.667 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.668 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.670 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.671 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.673 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.674 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.676 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.676 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.677 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.678 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:18.679 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.679 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.680 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam dst=null perm=null proto=rpc
18:46:18.681 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bam.bai dst=null perm=null proto=rpc
18:46:18.682 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8ef2b598-9cf7-4d34-a629-b81af8667e50.bai dst=null perm=null proto=rpc
18:46:18.685 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:18.686 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:18.687 INFO Executor - Finished task 0.0 in stage 53.0 (TID 91). 989 bytes result sent to driver
18:46:18.688 INFO TaskSetManager - Finished task 0.0 in stage 53.0 (TID 91) in 106 ms on localhost (executor driver) (1/1)
18:46:18.688 INFO TaskSchedulerImpl - Removed TaskSet 53.0, whose tasks have all completed, from pool
18:46:18.688 INFO DAGScheduler - ResultStage 53 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.126 s
18:46:18.688 INFO DAGScheduler - Job 39 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:18.688 INFO TaskSchedulerImpl - Killing all running tasks in stage 53: Stage finished
18:46:18.688 INFO DAGScheduler - Job 39 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.127257 s
18:46:18.694 INFO MemoryStore - Block broadcast_87 stored as values in memory (estimated size 298.0 KiB, free 1917.3 MiB)
18:46:18.705 INFO MemoryStore - Block broadcast_87_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.3 MiB)
18:46:18.705 INFO BlockManagerInfo - Added broadcast_87_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.4 MiB)
18:46:18.705 INFO SparkContext - Created broadcast 87 from newAPIHadoopFile at PathSplitSource.java:96
18:46:18.730 INFO MemoryStore - Block broadcast_88 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
18:46:18.736 INFO MemoryStore - Block broadcast_88_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.9 MiB)
18:46:18.736 INFO BlockManagerInfo - Added broadcast_88_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.4 MiB)
18:46:18.737 INFO SparkContext - Created broadcast 88 from newAPIHadoopFile at PathSplitSource.java:96
18:46:18.758 INFO FileInputFormat - Total input files to process : 1
18:46:18.760 INFO MemoryStore - Block broadcast_89 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
18:46:18.761 INFO MemoryStore - Block broadcast_89_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
18:46:18.761 INFO BlockManagerInfo - Added broadcast_89_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:18.761 INFO SparkContext - Created broadcast 89 from broadcast at ReadsSparkSink.java:133
18:46:18.762 INFO MemoryStore - Block broadcast_90 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
18:46:18.763 INFO MemoryStore - Block broadcast_90_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
18:46:18.763 INFO BlockManagerInfo - Added broadcast_90_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:18.763 INFO SparkContext - Created broadcast 90 from broadcast at BamSink.java:76
18:46:18.766 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts dst=null perm=null proto=rpc
18:46:18.766 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:18.766 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:18.766 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:18.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:18.773 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:18.774 INFO DAGScheduler - Registering RDD 204 (mapToPair at SparkUtils.java:161) as input to shuffle 12
18:46:18.774 INFO DAGScheduler - Got job 40 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:18.774 INFO DAGScheduler - Final stage: ResultStage 55 (runJob at SparkHadoopWriter.scala:83)
18:46:18.774 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 54)
18:46:18.774 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 54)
18:46:18.775 INFO DAGScheduler - Submitting ShuffleMapStage 54 (MapPartitionsRDD[204] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:18.793 INFO MemoryStore - Block broadcast_91 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
18:46:18.794 INFO MemoryStore - Block broadcast_91_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
18:46:18.794 INFO BlockManagerInfo - Added broadcast_91_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.2 MiB)
18:46:18.795 INFO SparkContext - Created broadcast 91 from broadcast at DAGScheduler.scala:1580
18:46:18.795 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 54 (MapPartitionsRDD[204] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:18.795 INFO TaskSchedulerImpl - Adding task set 54.0 with 1 tasks resource profile 0
18:46:18.796 INFO TaskSetManager - Starting task 0.0 in stage 54.0 (TID 92) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
18:46:18.796 INFO Executor - Running task 0.0 in stage 54.0 (TID 92)
18:46:18.827 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:18.847 INFO Executor - Finished task 0.0 in stage 54.0 (TID 92). 1148 bytes result sent to driver
18:46:18.848 INFO TaskSetManager - Finished task 0.0 in stage 54.0 (TID 92) in 52 ms on localhost (executor driver) (1/1)
18:46:18.848 INFO TaskSchedulerImpl - Removed TaskSet 54.0, whose tasks have all completed, from pool
18:46:18.848 INFO DAGScheduler - ShuffleMapStage 54 (mapToPair at SparkUtils.java:161) finished in 0.073 s
18:46:18.848 INFO DAGScheduler - looking for newly runnable stages
18:46:18.848 INFO DAGScheduler - running: HashSet()
18:46:18.848 INFO DAGScheduler - waiting: HashSet(ResultStage 55)
18:46:18.848 INFO DAGScheduler - failed: HashSet()
18:46:18.848 INFO DAGScheduler - Submitting ResultStage 55 (MapPartitionsRDD[209] at mapToPair at BamSink.java:91), which has no missing parents
18:46:18.860 INFO MemoryStore - Block broadcast_92 stored as values in memory (estimated size 241.5 KiB, free 1915.7 MiB)
18:46:18.861 INFO MemoryStore - Block broadcast_92_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.6 MiB)
18:46:18.861 INFO BlockManagerInfo - Added broadcast_92_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.1 MiB)
18:46:18.861 INFO SparkContext - Created broadcast 92 from broadcast at DAGScheduler.scala:1580
18:46:18.861 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 55 (MapPartitionsRDD[209] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:18.861 INFO TaskSchedulerImpl - Adding task set 55.0 with 1 tasks resource profile 0
18:46:18.862 INFO TaskSetManager - Starting task 0.0 in stage 55.0 (TID 93) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:18.862 INFO Executor - Running task 0.0 in stage 55.0 (TID 93)
18:46:18.870 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:18.870 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:18.888 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:18.888 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:18.888 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:18.888 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:18.888 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:18.888 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:18.890 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.891 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.892 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:18.896 INFO StateChange - BLOCK* allocate blk_1073741852_1028, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/part-r-00000
18:46:18.898 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741852_1028 src: /127.0.0.1:58036 dest: /127.0.0.1:38019
18:46:18.900 INFO clienttrace - src: /127.0.0.1:58036, dest: /127.0.0.1:38019, bytes: 229774, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741852_1028, duration(ns): 1679786
18:46:18.901 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741852_1028, type=LAST_IN_PIPELINE terminating
18:46:18.901 INFO FSNamesystem - BLOCK* blk_1073741852_1028 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/part-r-00000
18:46:19.302 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:19.303 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:19.304 INFO StateChange - BLOCK* allocate blk_1073741853_1029, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/.part-r-00000.sbi
18:46:19.305 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741853_1029 src: /127.0.0.1:58050 dest: /127.0.0.1:38019
18:46:19.306 INFO clienttrace - src: /127.0.0.1:58050, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741853_1029, duration(ns): 483382
18:46:19.307 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741853_1029, type=LAST_IN_PIPELINE terminating
18:46:19.307 INFO FSNamesystem - BLOCK* blk_1073741853_1029 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/.part-r-00000.sbi
18:46:19.708 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:19.710 INFO StateChange - BLOCK* allocate blk_1073741854_1030, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/.part-r-00000.bai
18:46:19.711 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741854_1030 src: /127.0.0.1:58056 dest: /127.0.0.1:38019
18:46:19.712 INFO clienttrace - src: /127.0.0.1:58056, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741854_1030, duration(ns): 446722
18:46:19.712 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741854_1030, type=LAST_IN_PIPELINE terminating
18:46:19.713 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:19.714 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0 dst=null perm=null proto=rpc
18:46:19.715 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0 dst=null perm=null proto=rpc
18:46:19.715 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/task_202505191846182717560759313762939_0209_r_000000 dst=null perm=null proto=rpc
18:46:19.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/_temporary/attempt_202505191846182717560759313762939_0209_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/task_202505191846182717560759313762939_0209_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:19.716 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846182717560759313762939_0209_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/task_202505191846182717560759313762939_0209_r_000000
18:46:19.716 INFO SparkHadoopMapRedUtil - attempt_202505191846182717560759313762939_0209_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:19.717 INFO Executor - Finished task 0.0 in stage 55.0 (TID 93). 1858 bytes result sent to driver
18:46:19.717 INFO TaskSetManager - Finished task 0.0 in stage 55.0 (TID 93) in 855 ms on localhost (executor driver) (1/1)
18:46:19.718 INFO TaskSchedulerImpl - Removed TaskSet 55.0, whose tasks have all completed, from pool
18:46:19.718 INFO DAGScheduler - ResultStage 55 (runJob at SparkHadoopWriter.scala:83) finished in 0.869 s
18:46:19.718 INFO DAGScheduler - Job 40 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:19.718 INFO TaskSchedulerImpl - Killing all running tasks in stage 55: Stage finished
18:46:19.718 INFO DAGScheduler - Job 40 finished: runJob at SparkHadoopWriter.scala:83, took 0.944802 s
18:46:19.719 INFO SparkHadoopWriter - Start to commit write Job job_202505191846182717560759313762939_0209.
18:46:19.720 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:19.720 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts dst=null perm=null proto=rpc
18:46:19.721 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/task_202505191846182717560759313762939_0209_r_000000 dst=null perm=null proto=rpc
18:46:19.721 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:19.722 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/task_202505191846182717560759313762939_0209_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:19.722 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:19.723 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/task_202505191846182717560759313762939_0209_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:19.724 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:19.724 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary/0/task_202505191846182717560759313762939_0209_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:19.725 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:19.726 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:19.726 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:19.727 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.spark-staging-209 dst=null perm=null proto=rpc
18:46:19.727 INFO SparkHadoopWriter - Write Job job_202505191846182717560759313762939_0209 committed. Elapsed time: 8 ms.
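[Editor's note] The audit trail above (listStatus on _temporary/0, per-file renames into the .parts directory, delete of _temporary, create of _SUCCESS, plus the expected allowed=false delete of Spark's .spark-staging directory) is the standard FileOutputCommitter algorithm-version-1 job commit. A minimal sketch of that sequence against the stock Hadoop FileSystem API is shown below; the class, method, and paths are illustrative stand-ins, not the committer's actual code.

```java
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch only: the v1 commit moves each committed task's files into the final
// output directory, removes the staging tree, and drops an empty _SUCCESS marker.
public final class CommitV1Sketch {
    public static void commitJob(FileSystem fs, Path outputDir) throws Exception {
        Path pending = new Path(outputDir, "_temporary/0");
        for (FileStatus task : fs.listStatus(pending)) {            // cmd=listStatus .../_temporary/0
            for (FileStatus file : fs.listStatus(task.getPath())) { // cmd=listStatus .../task_..._r_000000
                // HDFS rename is a metadata-only move, which is why the whole commit takes ~8 ms.
                fs.rename(file.getPath(), new Path(outputDir, file.getPath().getName())); // cmd=rename
            }
        }
        fs.delete(new Path(outputDir, "_temporary"), true);         // cmd=delete .../_temporary
        fs.create(new Path(outputDir, "_SUCCESS")).close();         // cmd=create .../_SUCCESS
    }
}
```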
18:46:19.728 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:19.730 INFO StateChange - BLOCK* allocate blk_1073741855_1031, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/header
18:46:19.731 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741855_1031 src: /127.0.0.1:58064 dest: /127.0.0.1:38019
18:46:19.732 INFO clienttrace - src: /127.0.0.1:58064, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741855_1031, duration(ns): 503370
18:46:19.732 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741855_1031, type=LAST_IN_PIPELINE terminating
18:46:19.733 INFO FSNamesystem - BLOCK* blk_1073741855_1031 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/header
18:46:20.134 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:20.135 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:20.136 INFO StateChange - BLOCK* allocate blk_1073741856_1032, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/terminator
18:46:20.137 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741856_1032 src: /127.0.0.1:58072 dest: /127.0.0.1:38019
18:46:20.138 INFO clienttrace - src: /127.0.0.1:58072, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741856_1032, duration(ns): 451014
18:46:20.138 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741856_1032, type=LAST_IN_PIPELINE terminating
18:46:20.139 INFO FSNamesystem - BLOCK* blk_1073741856_1032 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/terminator
18:46:20.540 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:20.541 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts dst=null perm=null proto=rpc
18:46:20.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:20.543 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:20.543 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam
18:46:20.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:20.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:20.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:20.545 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam done
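[Editor's note] The create/concat/rename triple above is how the header, the task's part file, and the BGZF terminator become a single BAM without copying any block data. The sketch below uses the Hadoop FileSystem API directly; the path names are placeholders, and Disq's HadoopFileSystemWrapper additionally deals with HDFS concat preconditions and variable part counts, which are omitted here.

```java
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch only: stitch the parts into one HDFS file with a metadata-level concat,
// then rename the result to the final .bam path.
public final class ConcatSketch {
    public static void concatParts(FileSystem fs, Path partsDir, Path finalBam) throws Exception {
        Path target = new Path(partsDir, "output");
        fs.create(target).close();                       // cmd=create .../output (empty target)
        fs.concat(target, new Path[] {                   // cmd=concat [header, part-r-00000, terminator]
                new Path(partsDir, "header"),
                new Path(partsDir, "part-r-00000"),
                new Path(partsDir, "terminator")});
        fs.rename(target, finalBam);                     // cmd=rename .../output -> final .bam
    }
}
```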
18:46:20.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:20.546 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi
18:46:20.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts dst=null perm=null proto=rpc
18:46:20.547 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:20.548 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:20.549 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:20.550 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:20.550 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:20.551 INFO StateChange - BLOCK* allocate blk_1073741857_1033, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi
18:46:20.552 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741857_1033 src: /127.0.0.1:58076 dest: /127.0.0.1:38019
18:46:20.554 INFO clienttrace - src: /127.0.0.1:58076, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741857_1033, duration(ns): 845950
18:46:20.554 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741857_1033, type=LAST_IN_PIPELINE terminating
18:46:20.555 INFO FSNamesystem - BLOCK* blk_1073741857_1033 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi
18:46:20.956 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:20.957 INFO IndexFileMerger - Done merging .sbi files
18:46:20.957 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai
18:46:20.958 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts dst=null perm=null proto=rpc
18:46:20.959 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:20.960 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:20.960 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:20.961 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:20.962 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:20.964 INFO StateChange - BLOCK* allocate blk_1073741858_1034, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai
18:46:20.965 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741858_1034 src: /127.0.0.1:58086 dest: /127.0.0.1:38019
18:46:20.966 INFO clienttrace - src: /127.0.0.1:58086, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741858_1034, duration(ns): 468955
18:46:20.966 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741858_1034, type=LAST_IN_PIPELINE terminating
18:46:20.967 INFO FSNamesystem - BLOCK* blk_1073741858_1034 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai
18:46:21.368 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:21.368 INFO IndexFileMerger - Done merging .bai files
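[Editor's note] Both index merges above follow the same listStatus/create/open/delete pattern: the hidden .part-r-00000.sbi (or .bai) is read back from the .parts directory, written into the final index file, and then removed. The sketch below captures only that file-shuffling skeleton; the real merger must rewrite index contents (virtual file offsets) rather than simply concatenate bytes, an assumption deliberately glossed over here.

```java
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Sketch only: copy every per-part index with the given suffix into one merged
// file, deleting each part afterwards, mirroring the audit sequence in the log.
public final class IndexMergeSketch {
    public static void merge(FileSystem fs, Path partsDir, Path merged, String suffix) throws Exception {
        try (FSDataOutputStream out = fs.create(merged)) {                 // cmd=create .bam.sbi / .bam.bai
            for (FileStatus stat : fs.listStatus(partsDir)) {              // cmd=listStatus .parts
                if (stat.getPath().getName().endsWith(suffix)) {
                    try (FSDataInputStream in = fs.open(stat.getPath())) { // cmd=open .part-r-00000.sbi/.bai
                        IOUtils.copyBytes(in, out, 8192, false);
                    }
                    fs.delete(stat.getPath(), false);                      // cmd=delete the per-part index
                }
            }
        }
    }
}
```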
18:46:21.369 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.parts dst=null perm=null proto=rpc
18:46:21.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.391 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi dst=null perm=null proto=rpc
18:46:21.392 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi dst=null perm=null proto=rpc
18:46:21.392 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi dst=null perm=null proto=rpc
18:46:21.394 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:21.394 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.396 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.396 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.397 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.398 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.399 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.399 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.401 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:21.404 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:21.404 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
18:46:21.404 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi dst=null perm=null proto=rpc
18:46:21.405 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi dst=null perm=null proto=rpc
18:46:21.405 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.sbi dst=null perm=null proto=rpc
18:46:21.406 INFO MemoryStore - Block broadcast_93 stored as values in memory (estimated size 320.0 B, free 1915.6 MiB)
18:46:21.407 INFO MemoryStore - Block broadcast_93_piece0 stored as bytes in memory (estimated size 233.0 B, free 1915.6 MiB)
18:46:21.407 INFO BlockManagerInfo - Added broadcast_93_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.1 MiB)
18:46:21.408 INFO SparkContext - Created broadcast 93 from broadcast at BamSource.java:104
18:46:21.409 INFO MemoryStore - Block broadcast_94 stored as values in memory (estimated size 297.9 KiB, free 1915.3 MiB)
18:46:21.415 INFO MemoryStore - Block broadcast_94_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
18:46:21.415 INFO BlockManagerInfo - Added broadcast_94_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.1 MiB)
18:46:21.415 INFO SparkContext - Created broadcast 94 from newAPIHadoopFile at PathSplitSource.java:96
18:46:21.425 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.425 INFO FileInputFormat - Total input files to process : 1
18:46:21.425 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.440 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:21.441 INFO DAGScheduler - Got job 41 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:21.441 INFO DAGScheduler - Final stage: ResultStage 56 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:21.441 INFO DAGScheduler - Parents of final stage: List()
18:46:21.441 INFO DAGScheduler - Missing parents: List()
18:46:21.441 INFO DAGScheduler - Submitting ResultStage 56 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:21.451 INFO MemoryStore - Block broadcast_95 stored as values in memory (estimated size 148.2 KiB, free 1915.1 MiB)
18:46:21.452 INFO MemoryStore - Block broadcast_95_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.1 MiB)
18:46:21.452 INFO BlockManagerInfo - Added broadcast_95_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.0 MiB)
18:46:21.453 INFO SparkContext - Created broadcast 95 from broadcast at DAGScheduler.scala:1580
18:46:21.453 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 56 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:21.453 INFO TaskSchedulerImpl - Adding task set 56.0 with 1 tasks resource profile 0
18:46:21.454 INFO TaskSetManager - Starting task 0.0 in stage 56.0 (TID 94) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:21.454 INFO Executor - Running task 0.0 in stage 56.0 (TID 94)
18:46:21.478 INFO BlockManagerInfo - Removed broadcast_83_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:21.479 INFO BlockManagerInfo - Removed broadcast_92_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.1 MiB)
18:46:21.480 INFO BlockManagerInfo - Removed broadcast_84_piece0 on localhost:45727 in memory (size: 153.7 KiB, free: 1919.3 MiB)
18:46:21.481 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam:0+235514
18:46:21.482 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.482 INFO BlockManagerInfo - Removed broadcast_77_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:21.483 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.483 INFO BlockManagerInfo - Removed broadcast_88_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.4 MiB)
18:46:21.484 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.484 INFO BlockManagerInfo - Removed broadcast_86_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.5 MiB)
18:46:21.485 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.485 INFO BlockManagerInfo - Removed broadcast_85_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:21.485 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.485 INFO BlockManagerInfo - Removed broadcast_90_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.7 MiB)
18:46:21.486 INFO BlockManagerInfo - Removed broadcast_91_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.8 MiB)
18:46:21.487 INFO BlockManagerInfo - Removed broadcast_89_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:21.487 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:21.489 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:21.490 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:21.491 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:21.494 INFO Executor - Finished task 0.0 in stage 56.0 (TID 94). 650184 bytes result sent to driver
18:46:21.496 INFO TaskSetManager - Finished task 0.0 in stage 56.0 (TID 94) in 42 ms on localhost (executor driver) (1/1)
18:46:21.496 INFO TaskSchedulerImpl - Removed TaskSet 56.0, whose tasks have all completed, from pool
18:46:21.496 INFO DAGScheduler - ResultStage 56 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.055 s
18:46:21.496 INFO DAGScheduler - Job 41 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:21.496 INFO TaskSchedulerImpl - Killing all running tasks in stage 56: Stage finished
18:46:21.497 INFO DAGScheduler - Job 41 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.056320 s
18:46:21.506 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:21.507 INFO DAGScheduler - Got job 42 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:21.507 INFO DAGScheduler - Final stage: ResultStage 57 (count at ReadsSparkSinkUnitTest.java:185)
18:46:21.507 INFO DAGScheduler - Parents of final stage: List()
18:46:21.507 INFO DAGScheduler - Missing parents: List()
18:46:21.507 INFO DAGScheduler - Submitting ResultStage 57 (MapPartitionsRDD[197] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:21.534 INFO MemoryStore - Block broadcast_96 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
18:46:21.535 INFO MemoryStore - Block broadcast_96_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
18:46:21.535 INFO BlockManagerInfo - Added broadcast_96_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.7 MiB)
18:46:21.535 INFO SparkContext - Created broadcast 96 from broadcast at DAGScheduler.scala:1580
18:46:21.536 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 57 (MapPartitionsRDD[197] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:21.536 INFO TaskSchedulerImpl - Adding task set 57.0 with 1 tasks resource profile 0
18:46:21.536 INFO TaskSetManager - Starting task 0.0 in stage 57.0 (TID 95) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
18:46:21.537 INFO Executor - Running task 0.0 in stage 57.0 (TID 95)
18:46:21.572 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:21.584 INFO Executor - Finished task 0.0 in stage 57.0 (TID 95). 989 bytes result sent to driver
18:46:21.585 INFO TaskSetManager - Finished task 0.0 in stage 57.0 (TID 95) in 49 ms on localhost (executor driver) (1/1)
18:46:21.585 INFO TaskSchedulerImpl - Removed TaskSet 57.0, whose tasks have all completed, from pool
18:46:21.585 INFO DAGScheduler - ResultStage 57 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.078 s
18:46:21.585 INFO DAGScheduler - Job 42 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:21.585 INFO TaskSchedulerImpl - Killing all running tasks in stage 57: Stage finished
18:46:21.585 INFO DAGScheduler - Job 42 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.079171 s
18:46:21.589 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:21.589 INFO DAGScheduler - Got job 43 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:21.589 INFO DAGScheduler - Final stage: ResultStage 58 (count at ReadsSparkSinkUnitTest.java:185)
18:46:21.589 INFO DAGScheduler - Parents of final stage: List()
18:46:21.589 INFO DAGScheduler - Missing parents: List()
18:46:21.590 INFO DAGScheduler - Submitting ResultStage 58 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:21.596 INFO MemoryStore - Block broadcast_97 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
18:46:21.597 INFO MemoryStore - Block broadcast_97_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
18:46:21.597 INFO BlockManagerInfo - Added broadcast_97_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:21.597 INFO SparkContext - Created broadcast 97 from broadcast at DAGScheduler.scala:1580
18:46:21.597 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 58 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:21.597 INFO TaskSchedulerImpl - Adding task set 58.0 with 1 tasks resource profile 0
18:46:21.598 INFO TaskSetManager - Starting task 0.0 in stage 58.0 (TID 96) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:21.598 INFO Executor - Running task 0.0 in stage 58.0 (TID 96)
18:46:21.615 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam:0+235514
18:46:21.616 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.617 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam dst=null perm=null proto=rpc
18:46:21.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.619 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.619 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_6e5341b3-20b2-4032-bfce-c406131d5be5.bam.bai dst=null perm=null proto=rpc
18:46:21.621 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:21.623 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:21.623 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:21.625 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
18:46:21.625 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:21.627 INFO Executor - Finished task 0.0 in stage 58.0 (TID 96). 989 bytes result sent to driver
18:46:21.628 INFO TaskSetManager - Finished task 0.0 in stage 58.0 (TID 96) in 30 ms on localhost (executor driver) (1/1)
18:46:21.628 INFO TaskSchedulerImpl - Removed TaskSet 58.0, whose tasks have all completed, from pool
18:46:21.628 INFO DAGScheduler - ResultStage 58 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.038 s
18:46:21.628 INFO DAGScheduler - Job 43 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:21.628 INFO TaskSchedulerImpl - Killing all running tasks in stage 58: Stage finished
18:46:21.628 INFO DAGScheduler - Job 43 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.039120 s
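[Editor's note] Jobs 41-43 are the test's read-back checks: the freshly written BAM (and its .bai/.sbi) is opened from HDFS, its reads are collected, and its record count is compared with a count over the original input BAM. A rough sketch of that pattern follows; readBam(...) is a hypothetical helper standing in for the test's actual read source, the paths are placeholders, and the exact assertions in ReadsSparkSinkUnitTest may differ.

```java
import java.util.List;
import org.apache.spark.api.java.JavaRDD;
import org.testng.Assert;

// Sketch only: readBam(...) is a hypothetical helper, not a real GATK/Disq API;
// it stands for "load a BAM into a JavaRDD<GATKRead>". Paths are placeholders.
String inputPath  = "<original local BAM>";
String outputPath = "<BAM written to HDFS above>";

JavaRDD<GATKRead> original = readBam(ctx, inputPath);
JavaRDD<GATKRead> written  = readBam(ctx, outputPath);

List<GATKRead> collected = written.collect();            // "collect at ReadsSparkSinkUnitTest.java:182"
Assert.assertEquals(written.count(), original.count());  // the two "count at ...:185" jobs
```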
18:46:21.632 INFO MemoryStore - Block broadcast_98 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
18:46:21.643 INFO MemoryStore - Block broadcast_98_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:21.643 INFO BlockManagerInfo - Added broadcast_98_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:21.643 INFO SparkContext - Created broadcast 98 from newAPIHadoopFile at PathSplitSource.java:96
18:46:21.666 INFO MemoryStore - Block broadcast_99 stored as values in memory (estimated size 298.0 KiB, free 1917.7 MiB)
18:46:21.672 INFO MemoryStore - Block broadcast_99_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
18:46:21.673 INFO BlockManagerInfo - Added broadcast_99_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:21.673 INFO SparkContext - Created broadcast 99 from newAPIHadoopFile at PathSplitSource.java:96
18:46:21.694 INFO FileInputFormat - Total input files to process : 1
18:46:21.696 INFO MemoryStore - Block broadcast_100 stored as values in memory (estimated size 19.6 KiB, free 1917.7 MiB)
18:46:21.696 INFO MemoryStore - Block broadcast_100_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.7 MiB)
18:46:21.696 INFO BlockManagerInfo - Added broadcast_100_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.5 MiB)
18:46:21.697 INFO SparkContext - Created broadcast 100 from broadcast at ReadsSparkSink.java:133
18:46:21.697 INFO MemoryStore - Block broadcast_101 stored as values in memory (estimated size 20.0 KiB, free 1917.6 MiB)
18:46:21.698 INFO MemoryStore - Block broadcast_101_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.6 MiB)
18:46:21.698 INFO BlockManagerInfo - Added broadcast_101_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.5 MiB)
18:46:21.698 INFO SparkContext - Created broadcast 101 from broadcast at BamSink.java:76
18:46:21.701 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts dst=null perm=null proto=rpc
18:46:21.701 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:21.701 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:21.701 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:21.702 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:21.708 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:21.709 INFO DAGScheduler - Registering RDD 229 (mapToPair at SparkUtils.java:161) as input to shuffle 13
18:46:21.709 INFO DAGScheduler - Got job 44 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:21.709 INFO DAGScheduler - Final stage: ResultStage 60 (runJob at SparkHadoopWriter.scala:83)
18:46:21.709 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 59)
18:46:21.709 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 59)
18:46:21.709 INFO DAGScheduler - Submitting ShuffleMapStage 59 (MapPartitionsRDD[229] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:21.733 INFO MemoryStore - Block broadcast_102 stored as values in memory (estimated size 434.3 KiB, free 1917.2 MiB)
18:46:21.734 INFO MemoryStore - Block broadcast_102_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1917.1 MiB)
18:46:21.734 INFO BlockManagerInfo - Added broadcast_102_piece0 in memory on localhost:45727 (size: 157.6 KiB, free: 1919.4 MiB)
18:46:21.735 INFO SparkContext - Created broadcast 102 from broadcast at DAGScheduler.scala:1580
18:46:21.735 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 59 (MapPartitionsRDD[229] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:21.735 INFO TaskSchedulerImpl - Adding task set 59.0 with 1 tasks resource profile 0
18:46:21.736 INFO TaskSetManager - Starting task 0.0 in stage 59.0 (TID 97) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
18:46:21.736 INFO Executor - Running task 0.0 in stage 59.0 (TID 97)
18:46:21.771 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:21.786 INFO Executor - Finished task 0.0 in stage 59.0 (TID 97). 1148 bytes result sent to driver
18:46:21.786 INFO TaskSetManager - Finished task 0.0 in stage 59.0 (TID 97) in 50 ms on localhost (executor driver) (1/1)
18:46:21.786 INFO TaskSchedulerImpl - Removed TaskSet 59.0, whose tasks have all completed, from pool
18:46:21.787 INFO DAGScheduler - ShuffleMapStage 59 (mapToPair at SparkUtils.java:161) finished in 0.077 s
18:46:21.787 INFO DAGScheduler - looking for newly runnable stages
18:46:21.787 INFO DAGScheduler - running: HashSet()
18:46:21.787 INFO DAGScheduler - waiting: HashSet(ResultStage 60)
18:46:21.787 INFO DAGScheduler - failed: HashSet()
18:46:21.787 INFO DAGScheduler - Submitting ResultStage 60 (MapPartitionsRDD[234] at mapToPair at BamSink.java:91), which has no missing parents
18:46:21.794 INFO MemoryStore - Block broadcast_103 stored as values in memory (estimated size 155.4 KiB, free 1916.9 MiB)
18:46:21.794 INFO MemoryStore - Block broadcast_103_piece0 stored as bytes in memory (estimated size 58.6 KiB, free 1916.8 MiB)
18:46:21.795 INFO BlockManagerInfo - Added broadcast_103_piece0 in memory on localhost:45727 (size: 58.6 KiB, free: 1919.3 MiB)
18:46:21.795 INFO SparkContext - Created broadcast 103 from broadcast at DAGScheduler.scala:1580
18:46:21.795 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 60 (MapPartitionsRDD[234] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:21.795 INFO TaskSchedulerImpl - Adding task set 60.0 with 1 tasks resource profile 0
18:46:21.796 INFO TaskSetManager - Starting task 0.0 in stage 60.0 (TID 98) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:21.796 INFO Executor - Running task 0.0 in stage 60.0 (TID 98)
18:46:21.800 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:21.800 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:21.814 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:21.814 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:21.814 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:21.814 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:21.814 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:21.814 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:21.815 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:21.816 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:21.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:21.820 INFO StateChange - BLOCK* allocate blk_1073741859_1035, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/part-r-00000
18:46:21.821 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741859_1035 src: /127.0.0.1:58126 dest: /127.0.0.1:38019
18:46:21.824 INFO clienttrace - src: /127.0.0.1:58126, dest: /127.0.0.1:38019, bytes: 235299, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741859_1035, duration(ns): 2054685
18:46:21.824 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741859_1035, type=LAST_IN_PIPELINE terminating
18:46:21.825 INFO FSNamesystem - BLOCK* blk_1073741859_1035 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/part-r-00000
18:46:22.226 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:22.227 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:22.228 INFO StateChange - BLOCK* allocate blk_1073741860_1036, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/.part-r-00000.sbi
18:46:22.229 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741860_1036 src: /127.0.0.1:58128 dest: /127.0.0.1:38019
18:46:22.230 INFO clienttrace - src: /127.0.0.1:58128, dest: /127.0.0.1:38019, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741860_1036, duration(ns): 413473
18:46:22.230 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741860_1036, type=LAST_IN_PIPELINE terminating
18:46:22.230 INFO FSNamesystem - BLOCK* blk_1073741860_1036 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/.part-r-00000.sbi
18:46:22.631 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:22.632 INFO StateChange - BLOCK* allocate blk_1073741861_1037, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/.part-r-00000.bai
18:46:22.633 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741861_1037 src: /127.0.0.1:58144 dest: /127.0.0.1:38019
18:46:22.634 INFO clienttrace - src: /127.0.0.1:58144, dest: /127.0.0.1:38019, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741861_1037, duration(ns): 439295
18:46:22.634 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741861_1037, type=LAST_IN_PIPELINE terminating
18:46:22.635 INFO FSNamesystem - BLOCK* blk_1073741861_1037 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/.part-r-00000.bai
18:46:23.036 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:23.037 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0 dst=null perm=null proto=rpc
18:46:23.038 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0 dst=null perm=null proto=rpc
18:46:23.039 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/task_2025051918462172644584382977541_0234_r_000000 dst=null perm=null proto=rpc
18:46:23.039 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/_temporary/attempt_2025051918462172644584382977541_0234_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/task_2025051918462172644584382977541_0234_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:23.039 INFO FileOutputCommitter - Saved output of task 'attempt_2025051918462172644584382977541_0234_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/task_2025051918462172644584382977541_0234_r_000000
18:46:23.040 INFO SparkHadoopMapRedUtil - attempt_2025051918462172644584382977541_0234_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:23.040 INFO Executor - Finished task 0.0 in stage 60.0 (TID 98). 1858 bytes result sent to driver
18:46:23.041 INFO TaskSetManager - Finished task 0.0 in stage 60.0 (TID 98) in 1245 ms on localhost (executor driver) (1/1)
18:46:23.041 INFO TaskSchedulerImpl - Removed TaskSet 60.0, whose tasks have all completed, from pool
18:46:23.041 INFO DAGScheduler - ResultStage 60 (runJob at SparkHadoopWriter.scala:83) finished in 1.254 s
18:46:23.041 INFO DAGScheduler - Job 44 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:23.041 INFO TaskSchedulerImpl - Killing all running tasks in stage 60: Stage finished
18:46:23.042 INFO DAGScheduler - Job 44 finished: runJob at SparkHadoopWriter.scala:83, took 1.333600 s
18:46:23.043 INFO SparkHadoopWriter - Start to commit write Job job_2025051918462172644584382977541_0234.
18:46:23.043 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:23.044 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts dst=null perm=null proto=rpc
18:46:23.044 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/task_2025051918462172644584382977541_0234_r_000000 dst=null perm=null proto=rpc
18:46:23.045 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:23.046 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/task_2025051918462172644584382977541_0234_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.046 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:23.047 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/task_2025051918462172644584382977541_0234_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.047 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:23.048 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary/0/task_2025051918462172644584382977541_0234_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.049 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:23.049 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.050 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:23.051 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.spark-staging-234 dst=null perm=null proto=rpc
18:46:23.051 INFO SparkHadoopWriter - Write Job job_2025051918462172644584382977541_0234 committed. Elapsed time: 8 ms.
18:46:23.052 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.053 INFO StateChange - BLOCK* allocate blk_1073741862_1038, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/header
18:46:23.054 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741862_1038 src: /127.0.0.1:58160 dest: /127.0.0.1:38019
18:46:23.055 INFO clienttrace - src: /127.0.0.1:58160, dest: /127.0.0.1:38019, bytes: 1190, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741862_1038, duration(ns): 449466
18:46:23.055 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741862_1038, type=LAST_IN_PIPELINE terminating
18:46:23.056 INFO FSNamesystem - BLOCK* blk_1073741862_1038 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/header
18:46:23.457 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:23.458 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.459 INFO StateChange - BLOCK* allocate blk_1073741863_1039, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/terminator
18:46:23.460 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741863_1039 src: /127.0.0.1:58166 dest: /127.0.0.1:38019
18:46:23.461 INFO clienttrace - src: /127.0.0.1:58166, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741863_1039, duration(ns): 448082
18:46:23.461 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741863_1039, type=LAST_IN_PIPELINE terminating
18:46:23.462 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:23.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts dst=null perm=null proto=rpc
18:46:23.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.464 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:23.464 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam
18:46:23.465 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.465 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.466 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam done
18:46:23.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.467 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi
18:46:23.467 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts dst=null perm=null proto=rpc
18:46:23.468 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.468 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:23.469 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:23.470 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
18:46:23.470 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:23.471 INFO StateChange - BLOCK* allocate blk_1073741864_1040, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi
18:46:23.472 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741864_1040 src: /127.0.0.1:58176 dest: /127.0.0.1:38019
18:46:23.473 INFO clienttrace - src: /127.0.0.1:58176, dest: /127.0.0.1:38019, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741864_1040, duration(ns): 359509
18:46:23.473 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741864_1040, type=LAST_IN_PIPELINE terminating
18:46:23.474 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:23.474 INFO IndexFileMerger - Done merging .sbi files
18:46:23.474 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai
18:46:23.475 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts dst=null perm=null proto=rpc
18:46:23.476 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:23.476 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:23.477 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:23.478 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:23.478 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:23.480 INFO StateChange - BLOCK* allocate blk_1073741865_1041, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai
18:46:23.481 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741865_1041 src: /127.0.0.1:58186 dest: /127.0.0.1:38019
18:46:23.482 INFO clienttrace - src: /127.0.0.1:58186, dest: /127.0.0.1:38019, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741865_1041, duration(ns): 369628
18:46:23.482 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741865_1041, type=LAST_IN_PIPELINE terminating
18:46:23.482 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:23.483 INFO IndexFileMerger - Done merging .bai files
18:46:23.483 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.parts dst=null perm=null proto=rpc
18:46:23.496 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi dst=null perm=null proto=rpc
18:46:23.504 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi dst=null perm=null proto=rpc
18:46:23.504 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi dst=null perm=null proto=rpc
18:46:23.505 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
18:46:23.506 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.506 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.508 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.509 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.509 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.511 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
18:46:23.512 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:23.512 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:23.513 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
18:46:23.513 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi dst=null perm=null proto=rpc
18:46:23.513 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi dst=null perm=null proto=rpc
18:46:23.514 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.sbi dst=null perm=null proto=rpc
18:46:23.515 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
18:46:23.515 INFO MemoryStore - Block broadcast_104 stored as values in memory (estimated size 312.0 B, free 1916.8 MiB)
18:46:23.516 INFO MemoryStore - Block broadcast_104_piece0 stored as bytes in memory (estimated size 231.0 B, free 1916.8 MiB)
18:46:23.516 INFO BlockManagerInfo - Added broadcast_104_piece0 in memory on localhost:45727 (size: 231.0 B, free: 1919.3 MiB)
18:46:23.517 INFO SparkContext - Created broadcast 104 from broadcast at BamSource.java:104
18:46:23.518 INFO MemoryStore - Block broadcast_105 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
18:46:23.528 INFO MemoryStore - Block broadcast_105_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.5 MiB)
18:46:23.528 INFO BlockManagerInfo - Added broadcast_105_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:23.529 INFO SparkContext - Created broadcast 105 from newAPIHadoopFile at PathSplitSource.java:96
18:46:23.543 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.544 INFO FileInputFormat - Total input files to process : 1
18:46:23.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.559 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:23.559 INFO DAGScheduler - Got job 45 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:23.559 INFO DAGScheduler - Final stage: ResultStage 61 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:23.559 INFO DAGScheduler - Parents of final stage: List()
18:46:23.559 INFO DAGScheduler - Missing parents: List()
18:46:23.559 INFO DAGScheduler - Submitting ResultStage 61 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:23.566 INFO MemoryStore - Block broadcast_106 stored as values in memory (estimated size 148.2 KiB, free 1916.4 MiB)
18:46:23.567 INFO MemoryStore - Block broadcast_106_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.3 MiB)
18:46:23.567 INFO BlockManagerInfo - Added broadcast_106_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.2 MiB)
18:46:23.567 INFO SparkContext - Created broadcast 106 from broadcast at DAGScheduler.scala:1580
18:46:23.568 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 61 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:23.568 INFO TaskSchedulerImpl - Adding task set 61.0 with 1 tasks resource profile 0
18:46:23.568 INFO TaskSetManager - Starting task 0.0 in stage 61.0 (TID 99) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:23.569 INFO Executor - Running task 0.0 in stage 61.0 (TID 99)
18:46:23.582 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam:0+236517
18:46:23.583 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.583 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.584 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.585 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.585 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.587 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
18:46:23.589 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:23.592 INFO Executor - Finished task 0.0 in stage 61.0 (TID 99). 749470 bytes result sent to driver
18:46:23.594 INFO TaskSetManager - Finished task 0.0 in stage 61.0 (TID 99) in 26 ms on localhost (executor driver) (1/1)
18:46:23.594 INFO TaskSchedulerImpl - Removed TaskSet 61.0, whose tasks have all completed, from pool
18:46:23.594 INFO DAGScheduler - ResultStage 61 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.034 s
18:46:23.594 INFO DAGScheduler - Job 45 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:23.594 INFO TaskSchedulerImpl - Killing all running tasks in stage 61: Stage finished
18:46:23.595 INFO DAGScheduler - Job 45 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.035920 s
18:46:23.614 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:23.614 INFO DAGScheduler - Got job 46 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:23.614 INFO DAGScheduler - Final stage: ResultStage 62 (count at ReadsSparkSinkUnitTest.java:185)
18:46:23.614 INFO DAGScheduler - Parents of final stage: List()
18:46:23.614 INFO DAGScheduler - Missing parents: List()
18:46:23.614 INFO DAGScheduler - Submitting ResultStage 62 (MapPartitionsRDD[222] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:23.641 INFO MemoryStore - Block broadcast_107 stored as values in memory (estimated size 426.1 KiB, free 1915.9 MiB)
18:46:23.642 INFO MemoryStore - Block broadcast_107_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
18:46:23.643 INFO BlockManagerInfo - Added broadcast_107_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.1 MiB)
18:46:23.643 INFO SparkContext - Created broadcast 107 from broadcast at DAGScheduler.scala:1580
18:46:23.643 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 62 (MapPartitionsRDD[222] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:23.643 INFO TaskSchedulerImpl - Adding task set 62.0 with 1 tasks resource profile 0
18:46:23.644 INFO TaskSetManager - Starting task 0.0 in stage 62.0 (TID 100) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
18:46:23.644 INFO Executor - Running task 0.0 in stage 62.0 (TID 100)
18:46:23.674 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:23.684 INFO Executor - Finished task 0.0 in stage 62.0 (TID 100). 989 bytes result sent to driver
18:46:23.684 INFO TaskSetManager - Finished task 0.0 in stage 62.0 (TID 100) in 40 ms on localhost (executor driver) (1/1)
18:46:23.684 INFO TaskSchedulerImpl - Removed TaskSet 62.0, whose tasks have all completed, from pool
18:46:23.685 INFO DAGScheduler - ResultStage 62 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.069 s
18:46:23.685 INFO DAGScheduler - Job 46 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:23.685 INFO TaskSchedulerImpl - Killing all running tasks in stage 62: Stage finished
18:46:23.685 INFO DAGScheduler - Job 46 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.070947 s
18:46:23.688 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:23.689 INFO DAGScheduler - Got job 47 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:23.689 INFO DAGScheduler - Final stage: ResultStage 63 (count at ReadsSparkSinkUnitTest.java:185)
18:46:23.689 INFO DAGScheduler - Parents of final stage: List()
18:46:23.689 INFO DAGScheduler - Missing parents: List()
18:46:23.689 INFO DAGScheduler - Submitting ResultStage 63 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:23.695 INFO MemoryStore - Block broadcast_108 stored as values in memory (estimated size 148.1 KiB, free 1915.6 MiB)
18:46:23.696 INFO MemoryStore - Block broadcast_108_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.5 MiB)
18:46:23.696 INFO BlockManagerInfo - Added broadcast_108_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.0 MiB)
18:46:23.697 INFO SparkContext - Created broadcast 108 from broadcast at DAGScheduler.scala:1580
18:46:23.697 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 63 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:23.697 INFO TaskSchedulerImpl - Adding task set 63.0 with 1 tasks resource profile 0
18:46:23.697 INFO TaskSetManager - Starting task 0.0 in stage 63.0 (TID 101) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:23.698 INFO Executor - Running task 0.0 in stage 63.0 (TID 101)
18:46:23.709 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam:0+236517
18:46:23.710 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.710 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam dst=null perm=null proto=rpc
18:46:23.711 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_a41a4a11-c8e5-4ddf-9219-26d83c0331f3.bam.bai dst=null perm=null proto=rpc
18:46:23.713 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
18:46:23.715 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:23.717 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:23.719 INFO Executor - Finished task 0.0 in stage 63.0 (TID 101). 989 bytes result sent to driver
18:46:23.719 INFO TaskSetManager - Finished task 0.0 in stage 63.0 (TID 101) in 22 ms on localhost (executor driver) (1/1)
18:46:23.719 INFO TaskSchedulerImpl - Removed TaskSet 63.0, whose tasks have all completed, from pool
18:46:23.720 INFO DAGScheduler - ResultStage 63 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.031 s
18:46:23.720 INFO DAGScheduler - Job 47 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:23.720 INFO TaskSchedulerImpl - Killing all running tasks in stage 63: Stage finished
18:46:23.720 INFO DAGScheduler - Job 47 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.031612 s
18:46:23.727 INFO MemoryStore - Block broadcast_109 stored as values in memory (estimated size 576.0 B, free 1915.5 MiB)
18:46:23.735 INFO MemoryStore - Block broadcast_109_piece0 stored as bytes in memory (estimated size 228.0 B, free 1915.5 MiB)
18:46:23.735 INFO BlockManagerInfo - Added broadcast_109_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.0 MiB)
18:46:23.735 INFO BlockManagerInfo - Removed broadcast_107_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.2 MiB)
18:46:23.735 INFO SparkContext - Created broadcast 109 from broadcast at CramSource.java:114
18:46:23.736 INFO BlockManagerInfo - Removed broadcast_87_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.2 MiB)
18:46:23.737 INFO BlockManagerInfo - Removed broadcast_100_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.2 MiB)
18:46:23.737 INFO MemoryStore - Block broadcast_110 stored as values in memory (estimated size 297.9 KiB, free 1916.2 MiB)
18:46:23.739 INFO BlockManagerInfo - Removed broadcast_95_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.3 MiB)
18:46:23.739 INFO BlockManagerInfo - Removed broadcast_106_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.3 MiB)
18:46:23.741 INFO BlockManagerInfo - Removed broadcast_97_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.4 MiB)
18:46:23.743 INFO BlockManagerInfo - Removed broadcast_105_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:23.744 INFO BlockManagerInfo - Removed broadcast_101_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.4 MiB)
18:46:23.745 INFO BlockManagerInfo - Removed broadcast_96_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.6 MiB)
18:46:23.747 INFO BlockManagerInfo - Removed broadcast_103_piece0 on localhost:45727 in memory (size: 58.6 KiB, free: 1919.6 MiB)
18:46:23.747 INFO MemoryStore - Block broadcast_110_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
18:46:23.747 INFO BlockManagerInfo - Added broadcast_110_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:23.748 INFO SparkContext - Created broadcast 110 from newAPIHadoopFile at PathSplitSource.java:96
18:46:23.748 INFO BlockManagerInfo - Removed broadcast_102_piece0 on localhost:45727 in memory (size: 157.6 KiB, free: 1919.7 MiB)
18:46:23.749 INFO BlockManagerInfo - Removed broadcast_98_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:23.750 INFO BlockManagerInfo - Removed broadcast_94_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:23.750 INFO BlockManagerInfo - Removed broadcast_104_piece0 on localhost:45727 in memory (size: 231.0 B, free: 1919.8 MiB)
18:46:23.751 INFO BlockManagerInfo - Removed broadcast_108_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.9 MiB)
18:46:23.752 INFO BlockManagerInfo - Removed broadcast_93_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.9 MiB)
18:46:23.753 INFO BlockManagerInfo - Removed broadcast_99_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1920.0 MiB)
18:46:23.769 INFO MemoryStore - Block broadcast_111 stored as values in memory (estimated size 576.0 B, free 1919.7 MiB)
18:46:23.770 INFO MemoryStore - Block broadcast_111_piece0 stored as bytes in memory (estimated size 228.0 B, free 1919.7 MiB)
18:46:23.770 INFO BlockManagerInfo - Added broadcast_111_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1920.0 MiB)
18:46:23.770 INFO SparkContext - Created broadcast 111 from broadcast at CramSource.java:114
18:46:23.771 INFO MemoryStore - Block broadcast_112 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
18:46:23.778 INFO MemoryStore - Block broadcast_112_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
18:46:23.778 INFO BlockManagerInfo - Added broadcast_112_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.9 MiB)
18:46:23.778 INFO SparkContext - Created broadcast 112 from newAPIHadoopFile at PathSplitSource.java:96
18:46:23.795 INFO FileInputFormat - Total input files to process : 1
18:46:23.797 INFO MemoryStore - Block broadcast_113 stored as values in memory (estimated size 6.0 KiB, free 1919.3 MiB)
18:46:23.797 INFO MemoryStore - Block broadcast_113_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
18:46:23.798 INFO BlockManagerInfo - Added broadcast_113_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.9 MiB)
18:46:23.798 INFO SparkContext - Created broadcast 113 from broadcast at ReadsSparkSink.java:133
18:46:23.799 INFO MemoryStore - Block broadcast_114 stored as values in memory (estimated size 6.2 KiB, free 1919.3 MiB)
18:46:23.799 INFO MemoryStore - Block broadcast_114_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
18:46:23.799 INFO BlockManagerInfo - Added broadcast_114_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.9 MiB)
18:46:23.800 INFO SparkContext - Created broadcast 114 from broadcast at CramSink.java:76
18:46:23.805 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts dst=null perm=null proto=rpc
18:46:23.805 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:23.805 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:23.805 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:23.806 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:23.812 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:23.813 INFO DAGScheduler - Registering RDD 252 (mapToPair at SparkUtils.java:161) as input to shuffle 14
18:46:23.813 INFO DAGScheduler - Got job 48 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:23.813 INFO DAGScheduler - Final stage: ResultStage 65 (runJob at SparkHadoopWriter.scala:83)
18:46:23.813 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 64)
18:46:23.813 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 64)
18:46:23.813 INFO DAGScheduler - Submitting ShuffleMapStage 64 (MapPartitionsRDD[252] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:23.826 INFO MemoryStore - Block broadcast_115 stored as values in memory (estimated size 292.8 KiB, free 1919.0 MiB)
18:46:23.827 INFO MemoryStore - Block broadcast_115_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1918.9 MiB)
18:46:23.827 INFO BlockManagerInfo - Added broadcast_115_piece0 in memory on localhost:45727 (size: 107.3 KiB, free: 1919.8 MiB)
18:46:23.827 INFO SparkContext - Created broadcast 115 from broadcast at DAGScheduler.scala:1580
18:46:23.827 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 64 (MapPartitionsRDD[252] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:23.827 INFO TaskSchedulerImpl - Adding task set 64.0 with 1 tasks resource profile 0
18:46:23.828 INFO TaskSetManager - Starting task 0.0 in stage 64.0 (TID 102) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
18:46:23.829 INFO Executor - Running task 0.0 in stage 64.0 (TID 102)
18:46:23.851 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:23.878 INFO Executor - Finished task 0.0 in stage 64.0 (TID 102). 1148 bytes result sent to driver
18:46:23.878 INFO TaskSetManager - Finished task 0.0 in stage 64.0 (TID 102) in 50 ms on localhost (executor driver) (1/1)
18:46:23.878 INFO TaskSchedulerImpl - Removed TaskSet 64.0, whose tasks have all completed, from pool
18:46:23.878 INFO DAGScheduler - ShuffleMapStage 64 (mapToPair at SparkUtils.java:161) finished in 0.064 s
18:46:23.879 INFO DAGScheduler - looking for newly runnable stages
18:46:23.879 INFO DAGScheduler - running: HashSet()
18:46:23.879 INFO DAGScheduler - waiting: HashSet(ResultStage 65)
18:46:23.879 INFO DAGScheduler - failed: HashSet()
18:46:23.879 INFO DAGScheduler - Submitting ResultStage 65 (MapPartitionsRDD[257] at mapToPair at CramSink.java:89), which has no missing parents
18:46:23.886 INFO MemoryStore - Block broadcast_116 stored as values in memory (estimated size 153.3 KiB, free 1918.8 MiB)
18:46:23.887 INFO MemoryStore - Block broadcast_116_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1918.7 MiB)
18:46:23.887 INFO BlockManagerInfo - Added broadcast_116_piece0 in memory on localhost:45727 (size: 58.1 KiB, free: 1919.7 MiB)
18:46:23.887 INFO SparkContext - Created broadcast 116 from broadcast at DAGScheduler.scala:1580
18:46:23.887 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 65 (MapPartitionsRDD[257] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
18:46:23.887 INFO TaskSchedulerImpl - Adding task set 65.0 with 1 tasks resource profile 0
18:46:23.888 INFO TaskSetManager - Starting task 0.0 in stage 65.0 (TID 103) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:23.889 INFO Executor - Running task 0.0 in stage 65.0 (TID 103)
18:46:23.896 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:23.897 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:23.908 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:23.909 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:23.909 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:23.909 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:23.909 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:23.909 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:23.912 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/_temporary/attempt_202505191846237686466574205386434_0257_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:24.032 INFO StateChange - BLOCK* allocate blk_1073741866_1042, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/_temporary/attempt_202505191846237686466574205386434_0257_r_000000_0/part-r-00000
18:46:24.033 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741866_1042 src: /127.0.0.1:58204 dest: /127.0.0.1:38019
18:46:24.035 INFO clienttrace - src: /127.0.0.1:58204, dest: /127.0.0.1:38019, bytes: 42659, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741866_1042, duration(ns): 524373
18:46:24.035 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741866_1042, type=LAST_IN_PIPELINE terminating
18:46:24.035 INFO FSNamesystem - BLOCK* blk_1073741866_1042 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/_temporary/attempt_202505191846237686466574205386434_0257_r_000000_0/part-r-00000
18:46:24.046 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741860_1036 replica FinalizedReplica, blk_1073741860_1036, FINALIZED
  getNumBytes()     = 204
  getBytesOnDisk()  = 204
  getVisibleLength()= 204
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741860 for deletion
18:46:24.046 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741861_1037 replica FinalizedReplica, blk_1073741861_1037, FINALIZED
  getNumBytes()     = 592
  getBytesOnDisk()  = 592
  getVisibleLength()= 592
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741861 for deletion
18:46:24.046 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741853_1029 replica FinalizedReplica, blk_1073741853_1029, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741853 for deletion
18:46:24.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741860_1036 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741860
18:46:24.046 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741854_1030 replica FinalizedReplica, blk_1073741854_1030, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741854 for deletion
18:46:24.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741854_1030 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741854
18:46:24.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741861_1037 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741861
18:46:24.046 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741853_1029 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741853
18:46:24.437 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/_temporary/attempt_202505191846237686466574205386434_0257_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:24.438 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/_temporary/attempt_202505191846237686466574205386434_0257_r_000000_0 dst=null perm=null proto=rpc
18:46:24.438 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/_temporary/attempt_202505191846237686466574205386434_0257_r_000000_0 dst=null perm=null proto=rpc
18:46:24.439 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/task_202505191846237686466574205386434_0257_r_000000 dst=null perm=null proto=rpc
18:46:24.440 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/_temporary/attempt_202505191846237686466574205386434_0257_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/task_202505191846237686466574205386434_0257_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:24.440 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846237686466574205386434_0257_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/task_202505191846237686466574205386434_0257_r_000000
18:46:24.440 INFO SparkHadoopMapRedUtil - attempt_202505191846237686466574205386434_0257_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:24.441 INFO Executor - Finished task 0.0 in stage 65.0 (TID 103). 1858 bytes result sent to driver
18:46:24.441 INFO TaskSetManager - Finished task 0.0 in stage 65.0 (TID 103) in 553 ms on localhost (executor driver) (1/1)
18:46:24.441 INFO TaskSchedulerImpl - Removed TaskSet 65.0, whose tasks have all completed, from pool
18:46:24.442 INFO DAGScheduler - ResultStage 65 (runJob at SparkHadoopWriter.scala:83) finished in 0.563 s
18:46:24.442 INFO DAGScheduler - Job 48 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:24.442 INFO TaskSchedulerImpl - Killing all running tasks in stage 65: Stage finished
18:46:24.442 INFO DAGScheduler - Job 48 finished: runJob at SparkHadoopWriter.scala:83, took 0.629718 s
18:46:24.442 INFO SparkHadoopWriter - Start to commit write Job job_202505191846237686466574205386434_0257.
18:46:24.443 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:24.444 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts dst=null perm=null proto=rpc
18:46:24.444 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/task_202505191846237686466574205386434_0257_r_000000 dst=null perm=null proto=rpc
18:46:24.445 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:24.445 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary/0/task_202505191846237686466574205386434_0257_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:24.446 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_temporary dst=null perm=null proto=rpc
18:46:24.447 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:24.448 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:24.448 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/.spark-staging-257 dst=null perm=null proto=rpc
18:46:24.449 INFO SparkHadoopWriter - Write Job job_202505191846237686466574205386434_0257 committed. Elapsed time: 6 ms.
18:46:24.449 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:24.451 INFO StateChange - BLOCK* allocate blk_1073741867_1043, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/header
18:46:24.452 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741867_1043 src: /127.0.0.1:58210 dest: /127.0.0.1:38019
18:46:24.453 INFO clienttrace - src: /127.0.0.1:58210, dest: /127.0.0.1:38019, bytes: 1016, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741867_1043, duration(ns): 395851
18:46:24.453 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741867_1043, type=LAST_IN_PIPELINE terminating
18:46:24.454 INFO FSNamesystem - BLOCK* blk_1073741867_1043 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/header
18:46:24.855 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:24.856 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:24.857 INFO StateChange - BLOCK* allocate blk_1073741868_1044, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/terminator
18:46:24.858 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741868_1044 src: /127.0.0.1:58226 dest: /127.0.0.1:38019
18:46:24.859 INFO clienttrace - src: /127.0.0.1:58226, dest: /127.0.0.1:38019, bytes: 38, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741868_1044, duration(ns): 410576
18:46:24.859 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741868_1044, type=LAST_IN_PIPELINE terminating
18:46:24.859 INFO FSNamesystem - BLOCK* blk_1073741868_1044 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/terminator
18:46:25.260 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:25.261 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts dst=null perm=null proto=rpc
18:46:25.263 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:25.264 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:25.264 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram
18:46:25.264 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/header, /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:25.265 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.265 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts/output dst=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:25.265 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram done
18:46:25.266 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.parts dst=null perm=null proto=rpc
18:46:25.266 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.267 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.267 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.268 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.268 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.crai dst=null perm=null proto=rpc
18:46:25.269 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.crai dst=null perm=null proto=rpc
18:46:25.271 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
18:46:25.272 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
18:46:25.272 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
18:46:25.273 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.273 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.274 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.crai dst=null perm=null proto=rpc
18:46:25.274 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.crai dst=null perm=null proto=rpc
18:46:25.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.276 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
18:46:25.277 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
18:46:25.277 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
18:46:25.278 INFO MemoryStore - Block broadcast_117 stored as values in memory (estimated size 528.0 B, free 1918.7 MiB)
18:46:25.278 INFO MemoryStore - Block broadcast_117_piece0 stored as bytes in memory (estimated size 187.0 B, free 1918.7 MiB)
18:46:25.279 INFO BlockManagerInfo - Added broadcast_117_piece0 in memory on localhost:45727 (size: 187.0 B, free: 1919.7 MiB)
18:46:25.279 INFO SparkContext - Created broadcast 117 from broadcast at CramSource.java:114
18:46:25.280 INFO MemoryStore - Block broadcast_118 stored as values in memory (estimated size 297.9 KiB, free 1918.4 MiB)
18:46:25.287 INFO MemoryStore - Block broadcast_118_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
18:46:25.287 INFO BlockManagerInfo - Added broadcast_118_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:25.287 INFO SparkContext - Created broadcast 118 from newAPIHadoopFile at PathSplitSource.java:96
18:46:25.306 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.306 INFO FileInputFormat - Total input files to process : 1
18:46:25.307 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.333 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:25.333 INFO DAGScheduler - Got job 49 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:25.333 INFO DAGScheduler - Final stage: ResultStage 66 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:25.333 INFO DAGScheduler - Parents of final stage: List()
18:46:25.333 INFO DAGScheduler - Missing parents: List()
18:46:25.333 INFO DAGScheduler - Submitting ResultStage 66 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:25.345 INFO MemoryStore - Block broadcast_119 stored as values in memory (estimated size 286.8 KiB, free 1918.1 MiB)
18:46:25.346 INFO MemoryStore - Block broadcast_119_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.0 MiB)
18:46:25.346 INFO BlockManagerInfo - Added broadcast_119_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.6 MiB)
18:46:25.347 INFO SparkContext - Created broadcast 119 from broadcast at DAGScheduler.scala:1580
18:46:25.347 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 66 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:25.347 INFO TaskSchedulerImpl - Adding task set 66.0 with 1 tasks resource profile 0
18:46:25.347 INFO TaskSetManager - Starting task 0.0 in stage 66.0 (TID 104) (localhost, executor driver, partition 0, ANY, 7853 bytes)
18:46:25.348 INFO Executor - Running task 0.0 in stage 66.0 (TID 104)
18:46:25.369 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram:0+43713
18:46:25.369 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.370 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.371 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.crai dst=null perm=null proto=rpc
18:46:25.371 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.crai dst=null perm=null proto=rpc
18:46:25.373 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
18:46:25.374 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
18:46:25.374 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
18:46:25.427 INFO Executor - Finished task 0.0 in stage 66.0 (TID 104). 154101 bytes result sent to driver
18:46:25.428 INFO TaskSetManager - Finished task 0.0 in stage 66.0 (TID 104) in 81 ms on localhost (executor driver) (1/1)
18:46:25.428 INFO TaskSchedulerImpl - Removed TaskSet 66.0, whose tasks have all completed, from pool
18:46:25.428 INFO DAGScheduler - ResultStage 66 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.094 s
18:46:25.428 INFO DAGScheduler - Job 49 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:25.428 INFO TaskSchedulerImpl - Killing all running tasks in stage 66: Stage finished
18:46:25.429 INFO DAGScheduler - Job 49 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.095847 s
18:46:25.434 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:25.434 INFO DAGScheduler - Got job 50 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:25.434 INFO DAGScheduler - Final stage: ResultStage 67 (count at ReadsSparkSinkUnitTest.java:185)
18:46:25.434 INFO DAGScheduler - Parents of final stage: List()
18:46:25.434 INFO DAGScheduler - Missing parents: List()
18:46:25.435 INFO DAGScheduler - Submitting ResultStage 67 (MapPartitionsRDD[246] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:25.447 INFO MemoryStore - Block broadcast_120 stored as values in memory (estimated size 286.8 KiB, free 1917.7 MiB)
18:46:25.448 INFO MemoryStore - Block broadcast_120_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.6 MiB)
18:46:25.448 INFO BlockManagerInfo - Added broadcast_120_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.5 MiB)
18:46:25.448 INFO SparkContext - Created broadcast 120 from broadcast at DAGScheduler.scala:1580
18:46:25.448 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 67 (MapPartitionsRDD[246] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:25.448 INFO TaskSchedulerImpl - Adding task set 67.0 with 1 tasks resource profile 0
18:46:25.449 INFO TaskSetManager - Starting task 0.0 in stage 67.0 (TID 105) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
18:46:25.449 INFO Executor - Running task 0.0 in stage 67.0 (TID 105)
18:46:25.469 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:25.481 INFO Executor - Finished task 0.0 in stage 67.0 (TID 105). 989 bytes result sent to driver
18:46:25.482 INFO TaskSetManager - Finished task 0.0 in stage 67.0 (TID 105) in 32 ms on localhost (executor driver) (1/1)
18:46:25.482 INFO TaskSchedulerImpl - Removed TaskSet 67.0, whose tasks have all completed, from pool
18:46:25.482 INFO DAGScheduler - ResultStage 67 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.047 s
18:46:25.482 INFO DAGScheduler - Job 50 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:25.482 INFO TaskSchedulerImpl - Killing all running tasks in stage 67: Stage finished
18:46:25.482 INFO DAGScheduler - Job 50 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.047999 s
18:46:25.485 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:25.486 INFO DAGScheduler - Got job 51 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:25.486 INFO DAGScheduler - Final stage: ResultStage 68 (count at ReadsSparkSinkUnitTest.java:185)
18:46:25.486 INFO DAGScheduler - Parents of final stage: List()
18:46:25.486 INFO DAGScheduler - Missing parents: List()
18:46:25.486 INFO DAGScheduler - Submitting ResultStage 68 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:25.503 INFO MemoryStore - Block broadcast_121 stored as values in memory (estimated size 286.8 KiB, free 1917.3 MiB)
18:46:25.505 INFO MemoryStore - Block broadcast_121_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.2 MiB)
18:46:25.505 INFO BlockManagerInfo - Added broadcast_121_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.4 MiB)
18:46:25.505 INFO SparkContext - Created broadcast 121 from broadcast at DAGScheduler.scala:1580
18:46:25.505 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 68 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:25.505 INFO TaskSchedulerImpl - Adding task set 68.0 with 1 tasks resource profile 0
18:46:25.506 INFO TaskSetManager - Starting task 0.0 in stage 68.0 (TID 106) (localhost, executor driver, partition 0, ANY, 7853 bytes)
18:46:25.506 INFO Executor - Running task 0.0 in stage 68.0 (TID 106)
18:46:25.531 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram:0+43713
18:46:25.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram dst=null perm=null proto=rpc
18:46:25.533 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.cram.crai dst=null perm=null proto=rpc
18:46:25.534 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_8d6daf1f-a4d0-4e3a-a153-56cb4622ae1a.crai dst=null perm=null proto=rpc
18:46:25.536 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
18:46:25.536 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
18:46:25.556 INFO Executor - Finished task 0.0 in stage 68.0 (TID 106). 989 bytes result sent to driver
18:46:25.556 INFO TaskSetManager - Finished task 0.0 in stage 68.0 (TID 106) in 50 ms on localhost (executor driver) (1/1)
18:46:25.556 INFO TaskSchedulerImpl - Removed TaskSet 68.0, whose tasks have all completed, from pool
18:46:25.557 INFO DAGScheduler - ResultStage 68 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.070 s
18:46:25.557 INFO DAGScheduler - Job 51 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:25.557 INFO TaskSchedulerImpl - Killing all running tasks in stage 68: Stage finished
18:46:25.557 INFO DAGScheduler - Job 51 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.071344 s
18:46:25.561 INFO MemoryStore - Block broadcast_122 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
18:46:25.567 INFO MemoryStore - Block broadcast_122_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
18:46:25.567 INFO BlockManagerInfo - Added broadcast_122_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:25.568 INFO SparkContext - Created broadcast 122 from newAPIHadoopFile at PathSplitSource.java:96
18:46:25.591 INFO MemoryStore - Block broadcast_123 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
18:46:25.599 INFO BlockManagerInfo - Removed broadcast_117_piece0 on localhost:45727 in memory (size: 187.0 B, free: 1919.3 MiB)
18:46:25.600 INFO BlockManagerInfo - Removed broadcast_115_piece0 on localhost:45727 in memory (size: 107.3 KiB, free: 1919.4 MiB)
18:46:25.601 INFO BlockManagerInfo - Removed broadcast_110_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:25.601 INFO BlockManagerInfo - Removed broadcast_116_piece0 on localhost:45727 in memory (size: 58.1 KiB, free: 1919.5 MiB)
18:46:25.602 INFO BlockManagerInfo - Removed broadcast_114_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.5 MiB)
18:46:25.603 INFO BlockManagerInfo - Removed broadcast_118_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:25.604 INFO BlockManagerInfo - Removed broadcast_113_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.6 MiB)
18:46:25.605 INFO BlockManagerInfo - Removed broadcast_112_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:25.606 INFO BlockManagerInfo - Removed broadcast_109_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.6 MiB)
18:46:25.607 INFO BlockManagerInfo - Removed broadcast_120_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.7 MiB)
18:46:25.607 INFO MemoryStore - Block broadcast_123_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.3 MiB)
18:46:25.607 INFO BlockManagerInfo - Added broadcast_123_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:25.607 INFO SparkContext - Created broadcast 123 from newAPIHadoopFile at PathSplitSource.java:96
18:46:25.608 INFO BlockManagerInfo - Removed broadcast_111_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.7 MiB)
18:46:25.609 INFO BlockManagerInfo - Removed broadcast_119_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.8 MiB)
18:46:25.609 INFO BlockManagerInfo - Removed broadcast_121_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.9 MiB)
18:46:25.632 INFO FileInputFormat - Total input files to process : 1
18:46:25.634 INFO MemoryStore - Block broadcast_124 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
18:46:25.635 INFO MemoryStore - Block broadcast_124_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
18:46:25.635 INFO BlockManagerInfo - Added broadcast_124_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.9 MiB)
18:46:25.636 INFO SparkContext - Created broadcast 124 from broadcast at ReadsSparkSink.java:133
18:46:25.647 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts dst=null perm=null proto=rpc
18:46:25.648 INFO deprecation - mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
18:46:25.649 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
18:46:25.649 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:25.649 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:25.650 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:25.657 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:25.658 INFO DAGScheduler - Registering RDD 277 (mapToPair at SparkUtils.java:161) as input to shuffle 15
18:46:25.658 INFO DAGScheduler - Got job 52 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:25.658 INFO DAGScheduler - Final stage: ResultStage 70 (runJob at SparkHadoopWriter.scala:83)
18:46:25.658 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 69)
18:46:25.658 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 69)
18:46:25.659 INFO DAGScheduler - Submitting ShuffleMapStage 69 (MapPartitionsRDD[277] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:25.687 INFO MemoryStore - Block broadcast_125 stored as values in memory (estimated size 520.4 KiB, free 1918.6 MiB)
18:46:25.689 INFO MemoryStore - Block broadcast_125_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.5 MiB)
18:46:25.689 INFO BlockManagerInfo - Added broadcast_125_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.7 MiB)
18:46:25.689 INFO SparkContext - Created broadcast 125 from broadcast at DAGScheduler.scala:1580
18:46:25.690 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 69 (MapPartitionsRDD[277] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:25.690 INFO TaskSchedulerImpl - Adding task set 69.0 with 1 tasks resource profile 0
18:46:25.690 INFO TaskSetManager - Starting task 0.0 in stage 69.0 (TID 107) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:25.691 INFO Executor - Running task 0.0 in stage 69.0 (TID 107)
18:46:25.721 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:25.740 INFO Executor - Finished task 0.0 in stage 69.0 (TID 107). 1148 bytes result sent to driver
18:46:25.740 INFO TaskSetManager - Finished task 0.0 in stage 69.0 (TID 107) in 50 ms on localhost (executor driver) (1/1)
18:46:25.740 INFO TaskSchedulerImpl - Removed TaskSet 69.0, whose tasks have all completed, from pool
18:46:25.741 INFO DAGScheduler - ShuffleMapStage 69 (mapToPair at SparkUtils.java:161) finished in 0.082 s
18:46:25.741 INFO DAGScheduler - looking for newly runnable stages
18:46:25.741 INFO DAGScheduler - running: HashSet()
18:46:25.741 INFO DAGScheduler - waiting: HashSet(ResultStage 70)
18:46:25.741 INFO DAGScheduler - failed: HashSet()
18:46:25.741 INFO DAGScheduler - Submitting ResultStage 70 (MapPartitionsRDD[283] at saveAsTextFile at SamSink.java:65), which has no missing parents
18:46:25.753 INFO MemoryStore - Block broadcast_126 stored as values in memory (estimated size 241.1 KiB, free 1918.2 MiB)
18:46:25.754 INFO MemoryStore - Block broadcast_126_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.2 MiB)
18:46:25.754 INFO BlockManagerInfo - Added broadcast_126_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.7 MiB)
18:46:25.755 INFO SparkContext - Created broadcast 126 from broadcast at DAGScheduler.scala:1580
18:46:25.755 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 70 (MapPartitionsRDD[283] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
18:46:25.755 INFO TaskSchedulerImpl - Adding task set 70.0 with 1 tasks resource profile 0
18:46:25.755 INFO TaskSetManager - Starting task 0.0 in stage 70.0 (TID 108) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:25.756 INFO Executor - Running task 0.0 in stage 70.0 (TID 108)
18:46:25.761 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:25.761 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:25.777 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
18:46:25.777 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:25.777 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:25.778 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/_temporary/attempt_202505191846253812057317817547229_0283_m_000000_0/part-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:25.781 INFO StateChange - BLOCK* allocate blk_1073741869_1045, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/_temporary/attempt_202505191846253812057317817547229_0283_m_000000_0/part-00000
18:46:25.782 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741869_1045 src: /127.0.0.1:58234 dest: /127.0.0.1:38019
18:46:25.789 INFO clienttrace - src: /127.0.0.1:58234, dest: /127.0.0.1:38019, bytes: 761729, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741869_1045, duration(ns): 5923740
18:46:25.789 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741869_1045, type=LAST_IN_PIPELINE terminating
18:46:25.790 INFO FSNamesystem - BLOCK* blk_1073741869_1045 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/_temporary/attempt_202505191846253812057317817547229_0283_m_000000_0/part-00000
18:46:26.190 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/_temporary/attempt_202505191846253812057317817547229_0283_m_000000_0/part-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:26.191 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/_temporary/attempt_202505191846253812057317817547229_0283_m_000000_0 dst=null perm=null proto=rpc
18:46:26.192 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/_temporary/attempt_202505191846253812057317817547229_0283_m_000000_0 dst=null perm=null proto=rpc
18:46:26.193 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/task_202505191846253812057317817547229_0283_m_000000 dst=null perm=null proto=rpc
18:46:26.193 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/_temporary/attempt_202505191846253812057317817547229_0283_m_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/task_202505191846253812057317817547229_0283_m_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:26.193 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846253812057317817547229_0283_m_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/task_202505191846253812057317817547229_0283_m_000000
18:46:26.194 INFO SparkHadoopMapRedUtil - attempt_202505191846253812057317817547229_0283_m_000000_0: Committed. Elapsed time: 1 ms.
18:46:26.194 INFO Executor - Finished task 0.0 in stage 70.0 (TID 108). 1858 bytes result sent to driver
18:46:26.195 INFO TaskSetManager - Finished task 0.0 in stage 70.0 (TID 108) in 440 ms on localhost (executor driver) (1/1)
18:46:26.195 INFO TaskSchedulerImpl - Removed TaskSet 70.0, whose tasks have all completed, from pool
18:46:26.195 INFO DAGScheduler - ResultStage 70 (runJob at SparkHadoopWriter.scala:83) finished in 0.454 s
18:46:26.195 INFO DAGScheduler - Job 52 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:26.195 INFO TaskSchedulerImpl - Killing all running tasks in stage 70: Stage finished
18:46:26.195 INFO DAGScheduler - Job 52 finished: runJob at SparkHadoopWriter.scala:83, took 0.537966 s
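
Job 52 above is the write job itself: a ShuffleMapStage keyed by mapToPair (SparkUtils.java:161) followed by a ResultStage that writes a text part file via saveAsTextFile (SamSink.java:65). A hypothetical sketch of that job shape in the Spark Java API, with the records keyed trivially by the line itself for illustration (names are not the GATK source):

    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import scala.Tuple2;

    final class SamLinesWriteSketch {
        // mapToPair introduces the ShuffleMapStage seen in the log; sortByKey
        // performs the shuffle, and saveAsTextFile runs the ResultStage that
        // lands part-00000 under the ".parts" directory.
        static void write(JavaRDD<String> samLines, String partsDir) {
            JavaPairRDD<String, String> keyed =
                    samLines.mapToPair(line -> new Tuple2<String, String>(line, line));
            keyed.sortByKey()
                 .values()
                 .saveAsTextFile(partsDir);
        }
    }
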
18:46:26.196 INFO SparkHadoopWriter - Start to commit write Job job_202505191846253812057317817547229_0283.
18:46:26.196 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:26.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts dst=null perm=null proto=rpc
18:46:26.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/task_202505191846253812057317817547229_0283_m_000000 dst=null perm=null proto=rpc
18:46:26.198 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/part-00000 dst=null perm=null proto=rpc
18:46:26.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary/0/task_202505191846253812057317817547229_0283_m_000000/part-00000 dst=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/part-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:26.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_temporary dst=null perm=null proto=rpc
18:46:26.200 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:26.201 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:26.202 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/.spark-staging-283 dst=null perm=null proto=rpc
18:46:26.202 INFO SparkHadoopWriter - Write Job job_202505191846253812057317817547229_0283 committed. Elapsed time: 5 ms.
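
The audit trail above shows FileOutputCommitter algorithm version 1 end to end: the task attempt writes under _temporary/0/_temporary/attempt_*, task commit renames that directory to _temporary/0/task_*, and job commit renames the task output into the final .parts directory and creates _SUCCESS. A hedged sketch of selecting the algorithm via the standard Hadoop property (v2 skips the extra job-commit rename at the cost of weaker atomicity); this is illustrative configuration, not the test's code:

    import org.apache.hadoop.conf.Configuration;

    final class CommitterConfigSketch {
        // Pin the commit protocol that produced the _temporary/attempt_*/task_*
        // rename sequence logged above.
        static Configuration withV1(Configuration conf) {
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            return conf;
        }
    }
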
18:46:26.202 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:26.206 INFO StateChange - BLOCK* allocate blk_1073741870_1046, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/header
18:46:26.207 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741870_1046 src: /127.0.0.1:58248 dest: /127.0.0.1:38019
18:46:26.208 INFO clienttrace - src: /127.0.0.1:58248, dest: /127.0.0.1:38019, bytes: 85829, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741870_1046, duration(ns): 618073
18:46:26.208 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741870_1046, type=LAST_IN_PIPELINE terminating
18:46:26.208 INFO FSNamesystem - BLOCK* blk_1073741870_1046 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/header
18:46:26.609 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:26.610 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts dst=null perm=null proto=rpc
18:46:26.612 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:26.613 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:26.613 INFO HadoopFileSystemWrapper - Concatenating 2 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam
18:46:26.613 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/header, /user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/part-00000] dst=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:26.614 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.615 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:26.615 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam done
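
The concat step just logged assembles the final SAM from the .parts directory: create an empty "output" file, concatenate the header file and part-00000 into it, rename it to the final .sam path, then delete the working directory. A hypothetical sketch of that sequence against the HDFS FileSystem API (real HDFS concat() has block-size preconditions, so this only illustrates the call shape; names are not the GATK source):

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    final class ConcatSketch {
        static void merge(Configuration conf, Path partsDir, Path finalSam) throws IOException {
            FileSystem fs = FileSystem.get(conf);
            Path output = new Path(partsDir, "output");
            // An empty target must exist before concat can append the sources to it.
            fs.create(output).close();
            // Header first, then the data part, mirroring the concat audit entry above.
            fs.concat(output, new Path[] { new Path(partsDir, "header"),
                                           new Path(partsDir, "part-00000") });
            fs.rename(output, finalSam);     // move the merged file into place
            fs.delete(partsDir, true);       // clean up the ".parts" directory
        }
    }
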
18:46:26.615 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam.parts dst=null perm=null proto=rpc
18:46:26.616 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.616 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.617 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.617 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
WARNING 2025-05-19 18:46:26 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:26.620 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
18:46:26.621 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
18:46:26.621 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.622 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.622 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
WARNING 2025-05-19 18:46:26 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:26.625 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
18:46:26.627 INFO MemoryStore - Block broadcast_127 stored as values in memory (estimated size 160.7 KiB, free 1918.0 MiB)
18:46:26.628 INFO MemoryStore - Block broadcast_127_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.0 MiB)
18:46:26.628 INFO BlockManagerInfo - Added broadcast_127_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.7 MiB)
18:46:26.628 INFO SparkContext - Created broadcast 127 from broadcast at SamSource.java:78
18:46:26.630 INFO MemoryStore - Block broadcast_128 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
18:46:26.640 INFO MemoryStore - Block broadcast_128_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
18:46:26.640 INFO BlockManagerInfo - Added broadcast_128_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:26.640 INFO SparkContext - Created broadcast 128 from newAPIHadoopFile at SamSource.java:108
18:46:26.647 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.648 INFO FileInputFormat - Total input files to process : 1
18:46:26.648 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.661 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:26.662 INFO DAGScheduler - Got job 53 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:26.662 INFO DAGScheduler - Final stage: ResultStage 71 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:26.662 INFO DAGScheduler - Parents of final stage: List()
18:46:26.662 INFO DAGScheduler - Missing parents: List()
18:46:26.662 INFO DAGScheduler - Submitting ResultStage 71 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:26.663 INFO MemoryStore - Block broadcast_129 stored as values in memory (estimated size 7.5 KiB, free 1917.7 MiB)
18:46:26.663 INFO MemoryStore - Block broadcast_129_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1917.7 MiB)
18:46:26.663 INFO BlockManagerInfo - Added broadcast_129_piece0 in memory on localhost:45727 (size: 3.8 KiB, free: 1919.6 MiB)
18:46:26.664 INFO SparkContext - Created broadcast 129 from broadcast at DAGScheduler.scala:1580
18:46:26.664 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 71 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:26.664 INFO TaskSchedulerImpl - Adding task set 71.0 with 1 tasks resource profile 0
18:46:26.664 INFO TaskSetManager - Starting task 0.0 in stage 71.0 (TID 109) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:26.665 INFO Executor - Running task 0.0 in stage 71.0 (TID 109)
18:46:26.666 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam:0+847558
18:46:26.671 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.709 INFO Executor - Finished task 0.0 in stage 71.0 (TID 109). 651526 bytes result sent to driver
18:46:26.710 INFO TaskSetManager - Finished task 0.0 in stage 71.0 (TID 109) in 46 ms on localhost (executor driver) (1/1)
18:46:26.710 INFO TaskSchedulerImpl - Removed TaskSet 71.0, whose tasks have all completed, from pool
18:46:26.710 INFO DAGScheduler - ResultStage 71 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.048 s
18:46:26.711 INFO DAGScheduler - Job 53 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:26.711 INFO TaskSchedulerImpl - Killing all running tasks in stage 71: Stage finished
18:46:26.711 INFO DAGScheduler - Job 53 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.049320 s
18:46:26.720 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:26.721 INFO DAGScheduler - Got job 54 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:26.721 INFO DAGScheduler - Final stage: ResultStage 72 (count at ReadsSparkSinkUnitTest.java:185)
18:46:26.721 INFO DAGScheduler - Parents of final stage: List()
18:46:26.721 INFO DAGScheduler - Missing parents: List()
18:46:26.721 INFO DAGScheduler - Submitting ResultStage 72 (MapPartitionsRDD[270] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:26.738 INFO MemoryStore - Block broadcast_130 stored as values in memory (estimated size 426.1 KiB, free 1917.2 MiB)
18:46:26.739 INFO MemoryStore - Block broadcast_130_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.1 MiB)
18:46:26.739 INFO BlockManagerInfo - Added broadcast_130_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:26.739 INFO SparkContext - Created broadcast 130 from broadcast at DAGScheduler.scala:1580
18:46:26.739 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 72 (MapPartitionsRDD[270] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:26.740 INFO TaskSchedulerImpl - Adding task set 72.0 with 1 tasks resource profile 0
18:46:26.740 INFO TaskSetManager - Starting task 0.0 in stage 72.0 (TID 110) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:26.740 INFO Executor - Running task 0.0 in stage 72.0 (TID 110)
18:46:26.770 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:26.780 INFO Executor - Finished task 0.0 in stage 72.0 (TID 110). 989 bytes result sent to driver
18:46:26.780 INFO TaskSetManager - Finished task 0.0 in stage 72.0 (TID 110) in 40 ms on localhost (executor driver) (1/1)
18:46:26.780 INFO TaskSchedulerImpl - Removed TaskSet 72.0, whose tasks have all completed, from pool
18:46:26.780 INFO DAGScheduler - ResultStage 72 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.059 s
18:46:26.781 INFO DAGScheduler - Job 54 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:26.781 INFO TaskSchedulerImpl - Killing all running tasks in stage 72: Stage finished
18:46:26.781 INFO DAGScheduler - Job 54 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060363 s
18:46:26.784 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:26.784 INFO DAGScheduler - Got job 55 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:26.785 INFO DAGScheduler - Final stage: ResultStage 73 (count at ReadsSparkSinkUnitTest.java:185)
18:46:26.785 INFO DAGScheduler - Parents of final stage: List()
18:46:26.785 INFO DAGScheduler - Missing parents: List()
18:46:26.785 INFO DAGScheduler - Submitting ResultStage 73 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:26.786 INFO MemoryStore - Block broadcast_131 stored as values in memory (estimated size 7.4 KiB, free 1917.1 MiB)
18:46:26.786 INFO MemoryStore - Block broadcast_131_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1917.1 MiB)
18:46:26.786 INFO BlockManagerInfo - Added broadcast_131_piece0 in memory on localhost:45727 (size: 3.8 KiB, free: 1919.4 MiB)
18:46:26.787 INFO SparkContext - Created broadcast 131 from broadcast at DAGScheduler.scala:1580
18:46:26.787 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 73 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:26.787 INFO TaskSchedulerImpl - Adding task set 73.0 with 1 tasks resource profile 0
18:46:26.787 INFO TaskSetManager - Starting task 0.0 in stage 73.0 (TID 111) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:26.788 INFO Executor - Running task 0.0 in stage 73.0 (TID 111)
18:46:26.789 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam:0+847558
18:46:26.791 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_0f82b77a-972f-46fa-afbb-0281bd763c70.sam dst=null perm=null proto=rpc
18:46:26.809 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
18:46:26.811 INFO Executor - Finished task 0.0 in stage 73.0 (TID 111). 989 bytes result sent to driver
18:46:26.811 INFO TaskSetManager - Finished task 0.0 in stage 73.0 (TID 111) in 24 ms on localhost (executor driver) (1/1)
18:46:26.811 INFO TaskSchedulerImpl - Removed TaskSet 73.0, whose tasks have all completed, from pool
18:46:26.812 INFO DAGScheduler - ResultStage 73 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.027 s
18:46:26.812 INFO DAGScheduler - Job 55 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:26.812 INFO TaskSchedulerImpl - Killing all running tasks in stage 73: Stage finished
18:46:26.812 INFO DAGScheduler - Job 55 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027633 s
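
Jobs 53 through 55 read the merged .sam back from HDFS (collect at ReadsSparkSinkUnitTest.java:182) and count both the original input and the round-tripped output (count at line 185). A hypothetical sketch of that round-trip assertion pattern, assuming TestNG as in the rest of the test suite (variable names are illustrative, not the test source):

    import org.apache.spark.api.java.JavaRDD;
    import org.testng.Assert;

    final class RoundTripCheckSketch {
        // Require that writing and re-reading the reads neither drops nor
        // duplicates records.
        static void assertSameSize(JavaRDD<?> originalReads, JavaRDD<?> writtenReads) {
            Assert.assertEquals(writtenReads.count(), originalReads.count());
        }
    }
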
18:46:26.815 INFO MemoryStore - Block broadcast_132 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
18:46:26.821 INFO MemoryStore - Block broadcast_132_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
18:46:26.822 INFO BlockManagerInfo - Added broadcast_132_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:26.822 INFO SparkContext - Created broadcast 132 from newAPIHadoopFile at PathSplitSource.java:96
18:46:26.847 INFO MemoryStore - Block broadcast_133 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
18:46:26.853 INFO MemoryStore - Block broadcast_133_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:26.853 INFO BlockManagerInfo - Added broadcast_133_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:26.854 INFO SparkContext - Created broadcast 133 from newAPIHadoopFile at PathSplitSource.java:96
18:46:26.874 INFO MemoryStore - Block broadcast_134 stored as values in memory (estimated size 160.7 KiB, free 1916.3 MiB)
18:46:26.875 INFO MemoryStore - Block broadcast_134_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
18:46:26.875 INFO BlockManagerInfo - Added broadcast_134_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:26.876 INFO SparkContext - Created broadcast 134 from broadcast at ReadsSparkSink.java:133
18:46:26.878 INFO MemoryStore - Block broadcast_135 stored as values in memory (estimated size 163.2 KiB, free 1916.1 MiB)
18:46:26.879 INFO MemoryStore - Block broadcast_135_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.1 MiB)
18:46:26.879 INFO BlockManagerInfo - Added broadcast_135_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:26.879 INFO SparkContext - Created broadcast 135 from broadcast at AnySamSinkMultiple.java:80
18:46:26.883 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:26.883 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:26.883 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:26.896 INFO FileInputFormat - Total input files to process : 1
18:46:26.905 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:26.905 INFO DAGScheduler - Registering RDD 296 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 16
18:46:26.905 INFO DAGScheduler - Got job 56 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:26.905 INFO DAGScheduler - Final stage: ResultStage 75 (runJob at SparkHadoopWriter.scala:83)
18:46:26.905 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 74)
18:46:26.905 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 74)
18:46:26.906 INFO DAGScheduler - Submitting ShuffleMapStage 74 (MapPartitionsRDD[296] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:26.923 INFO MemoryStore - Block broadcast_136 stored as values in memory (estimated size 427.7 KiB, free 1915.7 MiB)
18:46:26.925 INFO MemoryStore - Block broadcast_136_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1915.5 MiB)
18:46:26.925 INFO BlockManagerInfo - Added broadcast_136_piece0 in memory on localhost:45727 (size: 154.6 KiB, free: 1919.2 MiB)
18:46:26.925 INFO SparkContext - Created broadcast 136 from broadcast at DAGScheduler.scala:1580
18:46:26.925 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 74 (MapPartitionsRDD[296] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:26.926 INFO TaskSchedulerImpl - Adding task set 74.0 with 1 tasks resource profile 0
18:46:26.926 INFO TaskSetManager - Starting task 0.0 in stage 74.0 (TID 112) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:26.927 INFO Executor - Running task 0.0 in stage 74.0 (TID 112)
18:46:26.957 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:26.976 INFO Executor - Finished task 0.0 in stage 74.0 (TID 112). 1149 bytes result sent to driver
18:46:26.977 INFO TaskSetManager - Finished task 0.0 in stage 74.0 (TID 112) in 51 ms on localhost (executor driver) (1/1)
18:46:26.977 INFO TaskSchedulerImpl - Removed TaskSet 74.0, whose tasks have all completed, from pool
18:46:26.977 INFO DAGScheduler - ShuffleMapStage 74 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.071 s
18:46:26.977 INFO DAGScheduler - looking for newly runnable stages
18:46:26.977 INFO DAGScheduler - running: HashSet()
18:46:26.977 INFO DAGScheduler - waiting: HashSet(ResultStage 75)
18:46:26.977 INFO DAGScheduler - failed: HashSet()
18:46:26.977 INFO DAGScheduler - Submitting ResultStage 75 (MapPartitionsRDD[308] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:26.989 INFO MemoryStore - Block broadcast_137 stored as values in memory (estimated size 150.2 KiB, free 1915.4 MiB)
18:46:26.997 INFO BlockManagerInfo - Removed broadcast_127_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.2 MiB)
18:46:26.997 INFO MemoryStore - Block broadcast_137_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1915.3 MiB)
18:46:26.997 INFO BlockManagerInfo - Added broadcast_137_piece0 in memory on localhost:45727 (size: 56.2 KiB, free: 1919.1 MiB)
18:46:26.997 INFO SparkContext - Created broadcast 137 from broadcast at DAGScheduler.scala:1580
18:46:26.998 INFO BlockManagerInfo - Removed broadcast_131_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1919.1 MiB)
18:46:26.998 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 75 (MapPartitionsRDD[308] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:26.998 INFO TaskSchedulerImpl - Adding task set 75.0 with 2 tasks resource profile 0
18:46:26.998 INFO BlockManagerInfo - Removed broadcast_122_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:26.999 INFO BlockManagerInfo - Removed broadcast_130_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:26.999 INFO TaskSetManager - Starting task 0.0 in stage 75.0 (TID 113) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:26.999 INFO TaskSetManager - Starting task 1.0 in stage 75.0 (TID 114) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:26.999 INFO Executor - Running task 0.0 in stage 75.0 (TID 113)
18:46:27.000 INFO Executor - Running task 1.0 in stage 75.0 (TID 114)
18:46:27.000 INFO BlockManagerInfo - Removed broadcast_124_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:27.002 INFO BlockManagerInfo - Removed broadcast_126_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.4 MiB)
18:46:27.003 INFO BlockManagerInfo - Removed broadcast_123_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:27.004 INFO BlockManagerInfo - Removed broadcast_125_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.6 MiB)
18:46:27.004 INFO BlockManagerInfo - Removed broadcast_133_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:27.005 INFO BlockManagerInfo - Removed broadcast_128_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:27.006 INFO BlockManagerInfo - Removed broadcast_129_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1919.7 MiB)
18:46:27.007 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.007 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.007 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.007 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.007 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.007 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.008 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.008 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.008 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.008 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.008 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.008 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.019 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.019 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.022 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.022 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.030 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846261078455936508044972_0308_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest17611298248978503407.bam/_temporary/0/task_202505191846261078455936508044972_0308_r_000001
18:46:27.030 INFO SparkHadoopMapRedUtil - attempt_202505191846261078455936508044972_0308_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:27.031 INFO Executor - Finished task 1.0 in stage 75.0 (TID 114). 1729 bytes result sent to driver
18:46:27.031 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846261078455936508044972_0308_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest17611298248978503407.bam/_temporary/0/task_202505191846261078455936508044972_0308_r_000000
18:46:27.031 INFO SparkHadoopMapRedUtil - attempt_202505191846261078455936508044972_0308_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:27.031 INFO Executor - Finished task 0.0 in stage 75.0 (TID 113). 1729 bytes result sent to driver
18:46:27.031 INFO TaskSetManager - Finished task 1.0 in stage 75.0 (TID 114) in 32 ms on localhost (executor driver) (1/2)
18:46:27.031 INFO TaskSetManager - Finished task 0.0 in stage 75.0 (TID 113) in 32 ms on localhost (executor driver) (2/2)
18:46:27.032 INFO TaskSchedulerImpl - Removed TaskSet 75.0, whose tasks have all completed, from pool
18:46:27.032 INFO DAGScheduler - ResultStage 75 (runJob at SparkHadoopWriter.scala:83) finished in 0.054 s
18:46:27.032 INFO DAGScheduler - Job 56 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.032 INFO TaskSchedulerImpl - Killing all running tasks in stage 75: Stage finished
18:46:27.032 INFO DAGScheduler - Job 56 finished: runJob at SparkHadoopWriter.scala:83, took 0.127280 s
18:46:27.033 INFO SparkHadoopWriter - Start to commit write Job job_202505191846261078455936508044972_0308.
18:46:27.039 INFO SparkHadoopWriter - Write Job job_202505191846261078455936508044972_0308 committed. Elapsed time: 5 ms.
18:46:27.042 INFO MemoryStore - Block broadcast_138 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
18:46:27.053 INFO MemoryStore - Block broadcast_138_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
18:46:27.053 INFO BlockManagerInfo - Added broadcast_138_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:27.054 INFO SparkContext - Created broadcast 138 from newAPIHadoopFile at PathSplitSource.java:96
18:46:27.079 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:27.079 INFO DAGScheduler - Got job 57 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:27.079 INFO DAGScheduler - Final stage: ResultStage 77 (count at ReadsSparkSinkUnitTest.java:222)
18:46:27.079 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 76)
18:46:27.079 INFO DAGScheduler - Missing parents: List()
18:46:27.080 INFO DAGScheduler - Submitting ResultStage 77 (MapPartitionsRDD[299] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:27.081 INFO MemoryStore - Block broadcast_139 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
18:46:27.081 INFO MemoryStore - Block broadcast_139_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
18:46:27.081 INFO BlockManagerInfo - Added broadcast_139_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.7 MiB)
18:46:27.082 INFO SparkContext - Created broadcast 139 from broadcast at DAGScheduler.scala:1580
18:46:27.082 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 77 (MapPartitionsRDD[299] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:27.082 INFO TaskSchedulerImpl - Adding task set 77.0 with 2 tasks resource profile 0
18:46:27.083 INFO TaskSetManager - Starting task 0.0 in stage 77.0 (TID 115) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:27.083 INFO TaskSetManager - Starting task 1.0 in stage 77.0 (TID 116) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:27.083 INFO Executor - Running task 1.0 in stage 77.0 (TID 116)
18:46:27.083 INFO Executor - Running task 0.0 in stage 77.0 (TID 115)
18:46:27.085 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.085 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.085 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.085 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.091 INFO Executor - Finished task 1.0 in stage 77.0 (TID 116). 1634 bytes result sent to driver
18:46:27.091 INFO TaskSetManager - Finished task 1.0 in stage 77.0 (TID 116) in 8 ms on localhost (executor driver) (1/2)
18:46:27.092 INFO Executor - Finished task 0.0 in stage 77.0 (TID 115). 1591 bytes result sent to driver
18:46:27.092 INFO TaskSetManager - Finished task 0.0 in stage 77.0 (TID 115) in 10 ms on localhost (executor driver) (2/2)
18:46:27.093 INFO TaskSchedulerImpl - Removed TaskSet 77.0, whose tasks have all completed, from pool
18:46:27.094 INFO DAGScheduler - ResultStage 77 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.014 s
18:46:27.094 INFO DAGScheduler - Job 57 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.094 INFO TaskSchedulerImpl - Killing all running tasks in stage 77: Stage finished
18:46:27.094 INFO DAGScheduler - Job 57 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.015027 s
18:46:27.111 INFO FileInputFormat - Total input files to process : 2
18:46:27.115 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:27.116 INFO DAGScheduler - Got job 58 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:27.116 INFO DAGScheduler - Final stage: ResultStage 78 (count at ReadsSparkSinkUnitTest.java:222)
18:46:27.116 INFO DAGScheduler - Parents of final stage: List()
18:46:27.116 INFO DAGScheduler - Missing parents: List()
18:46:27.116 INFO DAGScheduler - Submitting ResultStage 78 (MapPartitionsRDD[315] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:27.134 INFO MemoryStore - Block broadcast_140 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
18:46:27.135 INFO MemoryStore - Block broadcast_140_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
18:46:27.135 INFO BlockManagerInfo - Added broadcast_140_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:27.135 INFO SparkContext - Created broadcast 140 from broadcast at DAGScheduler.scala:1580
18:46:27.135 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 78 (MapPartitionsRDD[315] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:27.135 INFO TaskSchedulerImpl - Adding task set 78.0 with 2 tasks resource profile 0
18:46:27.136 INFO TaskSetManager - Starting task 0.0 in stage 78.0 (TID 117) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
18:46:27.136 INFO TaskSetManager - Starting task 1.0 in stage 78.0 (TID 118) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
18:46:27.137 INFO Executor - Running task 0.0 in stage 78.0 (TID 117)
18:46:27.137 INFO Executor - Running task 1.0 in stage 78.0 (TID 118)
18:46:27.167 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17611298248978503407.bam/part-r-00001.bam:0+129330
18:46:27.176 INFO Executor - Finished task 0.0 in stage 78.0 (TID 117). 989 bytes result sent to driver
18:46:27.177 INFO TaskSetManager - Finished task 0.0 in stage 78.0 (TID 117) in 41 ms on localhost (executor driver) (1/2)
18:46:27.182 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17611298248978503407.bam/part-r-00000.bam:0+132492
18:46:27.195 INFO Executor - Finished task 1.0 in stage 78.0 (TID 118). 989 bytes result sent to driver
18:46:27.195 INFO TaskSetManager - Finished task 1.0 in stage 78.0 (TID 118) in 59 ms on localhost (executor driver) (2/2)
18:46:27.195 INFO TaskSchedulerImpl - Removed TaskSet 78.0, whose tasks have all completed, from pool
18:46:27.195 INFO DAGScheduler - ResultStage 78 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.079 s
18:46:27.196 INFO DAGScheduler - Job 58 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.196 INFO TaskSchedulerImpl - Killing all running tasks in stage 78: Stage finished
18:46:27.196 INFO DAGScheduler - Job 58 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.081042 s
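
Jobs 56 through 58 exercise the multi-shard path: the input is repartitioned into two partitions (repartition at ReadsSparkSinkUnitTest.java:210), each partition is written as its own part-r-*.bam (AnySamSinkMultiple), and Job 58 reads both shards back to count them. A hypothetical sketch of the repartition step that determines the number of output shards (illustrative only):

    import org.apache.spark.api.java.JavaRDD;

    final class ShardedWriteSketch {
        // Repartitioning into numShards partitions makes the sink emit one
        // part-r-XXXXX.bam per partition; reading the shard directory back
        // should yield the same total record count.
        static <T> JavaRDD<T> shard(JavaRDD<T> reads, int numShards) {
            return reads.repartition(numShards);
        }
    }
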
18:46:27.199 INFO MemoryStore - Block broadcast_141 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
18:46:27.205 INFO MemoryStore - Block broadcast_141_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
18:46:27.205 INFO BlockManagerInfo - Added broadcast_141_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:27.205 INFO SparkContext - Created broadcast 141 from newAPIHadoopFile at PathSplitSource.java:96
18:46:27.229 INFO MemoryStore - Block broadcast_142 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
18:46:27.235 INFO MemoryStore - Block broadcast_142_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
18:46:27.235 INFO BlockManagerInfo - Added broadcast_142_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:27.236 INFO SparkContext - Created broadcast 142 from newAPIHadoopFile at PathSplitSource.java:96
18:46:27.256 INFO MemoryStore - Block broadcast_143 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
18:46:27.257 INFO MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
18:46:27.257 INFO BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.4 MiB)
18:46:27.257 INFO SparkContext - Created broadcast 143 from broadcast at ReadsSparkSink.java:133
18:46:27.259 INFO MemoryStore - Block broadcast_144 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
18:46:27.259 INFO MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
18:46:27.259 INFO BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.4 MiB)
18:46:27.260 INFO SparkContext - Created broadcast 144 from broadcast at AnySamSinkMultiple.java:80
18:46:27.262 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.262 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.262 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.274 INFO FileInputFormat - Total input files to process : 1
18:46:27.281 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:27.282 INFO DAGScheduler - Registering RDD 323 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 17
18:46:27.282 INFO DAGScheduler - Got job 59 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:27.282 INFO DAGScheduler - Final stage: ResultStage 80 (runJob at SparkHadoopWriter.scala:83)
18:46:27.282 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 79)
18:46:27.282 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 79)
18:46:27.282 INFO DAGScheduler - Submitting ShuffleMapStage 79 (MapPartitionsRDD[323] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:27.302 INFO MemoryStore - Block broadcast_145 stored as values in memory (estimated size 427.7 KiB, free 1916.2 MiB)
18:46:27.303 INFO MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.1 MiB)
18:46:27.303 INFO BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:45727 (size: 154.6 KiB, free: 1919.3 MiB)
18:46:27.303 INFO SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1580
18:46:27.304 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 79 (MapPartitionsRDD[323] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:27.304 INFO TaskSchedulerImpl - Adding task set 79.0 with 1 tasks resource profile 0
18:46:27.304 INFO TaskSetManager - Starting task 0.0 in stage 79.0 (TID 119) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:27.305 INFO Executor - Running task 0.0 in stage 79.0 (TID 119)
18:46:27.334 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:27.352 INFO Executor - Finished task 0.0 in stage 79.0 (TID 119). 1149 bytes result sent to driver
18:46:27.352 INFO TaskSetManager - Finished task 0.0 in stage 79.0 (TID 119) in 48 ms on localhost (executor driver) (1/1)
18:46:27.352 INFO TaskSchedulerImpl - Removed TaskSet 79.0, whose tasks have all completed, from pool
18:46:27.353 INFO DAGScheduler - ShuffleMapStage 79 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.070 s
18:46:27.353 INFO DAGScheduler - looking for newly runnable stages
18:46:27.353 INFO DAGScheduler - running: HashSet()
18:46:27.353 INFO DAGScheduler - waiting: HashSet(ResultStage 80)
18:46:27.353 INFO DAGScheduler - failed: HashSet()
18:46:27.353 INFO DAGScheduler - Submitting ResultStage 80 (MapPartitionsRDD[335] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:27.364 INFO MemoryStore - Block broadcast_146 stored as values in memory (estimated size 150.2 KiB, free 1915.9 MiB)
18:46:27.365 INFO MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1915.9 MiB)
18:46:27.365 INFO BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:45727 (size: 56.2 KiB, free: 1919.2 MiB)
18:46:27.365 INFO SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1580
18:46:27.366 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 80 (MapPartitionsRDD[335] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:27.366 INFO TaskSchedulerImpl - Adding task set 80.0 with 2 tasks resource profile 0
18:46:27.366 INFO TaskSetManager - Starting task 0.0 in stage 80.0 (TID 120) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:27.366 INFO TaskSetManager - Starting task 1.0 in stage 80.0 (TID 121) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:27.367 INFO Executor - Running task 0.0 in stage 80.0 (TID 120)
18:46:27.367 INFO Executor - Running task 1.0 in stage 80.0 (TID 121)
18:46:27.373 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.373 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.373 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.373 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.373 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.373 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.373 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.373 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.373 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.373 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.373 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.373 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.382 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.382 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.388 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.388 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.390 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846273715991776619667163_0335_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest13361135021015566297.bam/_temporary/0/task_202505191846273715991776619667163_0335_r_000000
18:46:27.390 INFO SparkHadoopMapRedUtil - attempt_202505191846273715991776619667163_0335_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:27.391 INFO Executor - Finished task 0.0 in stage 80.0 (TID 120). 1729 bytes result sent to driver
18:46:27.391 INFO TaskSetManager - Finished task 0.0 in stage 80.0 (TID 120) in 25 ms on localhost (executor driver) (1/2)
18:46:27.397 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846273715991776619667163_0335_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest13361135021015566297.bam/_temporary/0/task_202505191846273715991776619667163_0335_r_000001
18:46:27.397 INFO SparkHadoopMapRedUtil - attempt_202505191846273715991776619667163_0335_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:27.397 INFO Executor - Finished task 1.0 in stage 80.0 (TID 121). 1729 bytes result sent to driver
18:46:27.398 INFO TaskSetManager - Finished task 1.0 in stage 80.0 (TID 121) in 32 ms on localhost (executor driver) (2/2)
18:46:27.398 INFO TaskSchedulerImpl - Removed TaskSet 80.0, whose tasks have all completed, from pool
18:46:27.398 INFO DAGScheduler - ResultStage 80 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
18:46:27.398 INFO DAGScheduler - Job 59 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.398 INFO TaskSchedulerImpl - Killing all running tasks in stage 80: Stage finished
18:46:27.398 INFO DAGScheduler - Job 59 finished: runJob at SparkHadoopWriter.scala:83, took 0.116907 s
18:46:27.398 INFO SparkHadoopWriter - Start to commit write Job job_202505191846273715991776619667163_0335.
18:46:27.404 INFO SparkHadoopWriter - Write Job job_202505191846273715991776619667163_0335 committed. Elapsed time: 5 ms.
18:46:27.407 INFO MemoryStore - Block broadcast_147 stored as values in memory (estimated size 297.9 KiB, free 1915.6 MiB)
18:46:27.419 INFO MemoryStore - Block broadcast_147_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.5 MiB)
18:46:27.419 INFO BlockManagerInfo - Added broadcast_147_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:27.420 INFO SparkContext - Created broadcast 147 from newAPIHadoopFile at PathSplitSource.java:96
18:46:27.451 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:27.452 INFO DAGScheduler - Got job 60 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:27.452 INFO DAGScheduler - Final stage: ResultStage 82 (count at ReadsSparkSinkUnitTest.java:222)
18:46:27.452 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 81)
18:46:27.452 INFO DAGScheduler - Missing parents: List()
18:46:27.452 INFO DAGScheduler - Submitting ResultStage 82 (MapPartitionsRDD[326] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:27.453 INFO MemoryStore - Block broadcast_148 stored as values in memory (estimated size 6.3 KiB, free 1915.5 MiB)
18:46:27.453 INFO MemoryStore - Block broadcast_148_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1915.5 MiB)
18:46:27.454 INFO BlockManagerInfo - Added broadcast_148_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.1 MiB)
18:46:27.454 INFO SparkContext - Created broadcast 148 from broadcast at DAGScheduler.scala:1580
18:46:27.454 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 82 (MapPartitionsRDD[326] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:27.454 INFO TaskSchedulerImpl - Adding task set 82.0 with 2 tasks resource profile 0
18:46:27.455 INFO TaskSetManager - Starting task 0.0 in stage 82.0 (TID 122) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:27.455 INFO TaskSetManager - Starting task 1.0 in stage 82.0 (TID 123) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:27.456 INFO Executor - Running task 1.0 in stage 82.0 (TID 123)
18:46:27.456 INFO Executor - Running task 0.0 in stage 82.0 (TID 122)
18:46:27.457 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.457 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.457 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.457 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.462 INFO Executor - Finished task 0.0 in stage 82.0 (TID 122). 1591 bytes result sent to driver
18:46:27.462 INFO Executor - Finished task 1.0 in stage 82.0 (TID 123). 1591 bytes result sent to driver
18:46:27.462 INFO TaskSetManager - Finished task 0.0 in stage 82.0 (TID 122) in 7 ms on localhost (executor driver) (1/2)
18:46:27.462 INFO TaskSetManager - Finished task 1.0 in stage 82.0 (TID 123) in 7 ms on localhost (executor driver) (2/2)
18:46:27.462 INFO TaskSchedulerImpl - Removed TaskSet 82.0, whose tasks have all completed, from pool
18:46:27.462 INFO DAGScheduler - ResultStage 82 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
18:46:27.463 INFO DAGScheduler - Job 60 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.463 INFO TaskSchedulerImpl - Killing all running tasks in stage 82: Stage finished
18:46:27.463 INFO DAGScheduler - Job 60 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011370 s
18:46:27.475 INFO FileInputFormat - Total input files to process : 2
18:46:27.480 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:27.480 INFO DAGScheduler - Got job 61 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:27.480 INFO DAGScheduler - Final stage: ResultStage 83 (count at ReadsSparkSinkUnitTest.java:222)
18:46:27.480 INFO DAGScheduler - Parents of final stage: List()
18:46:27.480 INFO DAGScheduler - Missing parents: List()
18:46:27.480 INFO DAGScheduler - Submitting ResultStage 83 (MapPartitionsRDD[342] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:27.498 INFO MemoryStore - Block broadcast_149 stored as values in memory (estimated size 426.1 KiB, free 1915.1 MiB)
18:46:27.507 INFO BlockManagerInfo - Removed broadcast_138_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:27.507 INFO MemoryStore - Block broadcast_149_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.5 MiB)
18:46:27.507 INFO BlockManagerInfo - Removed broadcast_146_piece0 on localhost:45727 in memory (size: 56.2 KiB, free: 1919.3 MiB)
18:46:27.508 INFO BlockManagerInfo - Added broadcast_149_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.1 MiB)
18:46:27.508 INFO SparkContext - Created broadcast 149 from broadcast at DAGScheduler.scala:1580
18:46:27.508 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 83 (MapPartitionsRDD[342] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:27.508 INFO TaskSchedulerImpl - Adding task set 83.0 with 2 tasks resource profile 0
18:46:27.509 INFO BlockManagerInfo - Removed broadcast_135_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.1 MiB)
18:46:27.509 INFO TaskSetManager - Starting task 0.0 in stage 83.0 (TID 124) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
18:46:27.509 INFO BlockManagerInfo - Removed broadcast_140_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:27.509 INFO TaskSetManager - Starting task 1.0 in stage 83.0 (TID 125) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
18:46:27.510 INFO Executor - Running task 0.0 in stage 83.0 (TID 124)
18:46:27.510 INFO BlockManagerInfo - Removed broadcast_143_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:27.510 INFO Executor - Running task 1.0 in stage 83.0 (TID 125)
18:46:27.513 INFO BlockManagerInfo - Removed broadcast_136_piece0 on localhost:45727 in memory (size: 154.6 KiB, free: 1919.4 MiB)
18:46:27.513 INFO BlockManagerInfo - Removed broadcast_148_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.4 MiB)
18:46:27.515 INFO BlockManagerInfo - Removed broadcast_137_piece0 on localhost:45727 in memory (size: 56.2 KiB, free: 1919.5 MiB)
18:46:27.516 INFO BlockManagerInfo - Removed broadcast_139_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.5 MiB)
18:46:27.516 INFO BlockManagerInfo - Removed broadcast_132_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:27.518 INFO BlockManagerInfo - Removed broadcast_144_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:27.519 INFO BlockManagerInfo - Removed broadcast_142_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:27.519 INFO BlockManagerInfo - Removed broadcast_134_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:27.520 INFO BlockManagerInfo - Removed broadcast_145_piece0 on localhost:45727 in memory (size: 154.6 KiB, free: 1919.8 MiB)
18:46:27.545 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13361135021015566297.bam/part-r-00001.bam:0+129330
18:46:27.555 INFO Executor - Finished task 0.0 in stage 83.0 (TID 124). 989 bytes result sent to driver
18:46:27.555 INFO TaskSetManager - Finished task 0.0 in stage 83.0 (TID 124) in 46 ms on localhost (executor driver) (1/2)
18:46:27.556 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13361135021015566297.bam/part-r-00000.bam:0+132492
18:46:27.569 INFO Executor - Finished task 1.0 in stage 83.0 (TID 125). 989 bytes result sent to driver
18:46:27.569 INFO TaskSetManager - Finished task 1.0 in stage 83.0 (TID 125) in 60 ms on localhost (executor driver) (2/2)
18:46:27.569 INFO TaskSchedulerImpl - Removed TaskSet 83.0, whose tasks have all completed, from pool
18:46:27.570 INFO DAGScheduler - ResultStage 83 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.089 s
18:46:27.570 INFO DAGScheduler - Job 61 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.570 INFO TaskSchedulerImpl - Killing all running tasks in stage 83: Stage finished
18:46:27.570 INFO DAGScheduler - Job 61 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.090383 s
18:46:27.574 INFO MemoryStore - Block broadcast_150 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
18:46:27.580 INFO MemoryStore - Block broadcast_150_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
18:46:27.580 INFO BlockManagerInfo - Added broadcast_150_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:27.581 INFO SparkContext - Created broadcast 150 from newAPIHadoopFile at PathSplitSource.java:96
18:46:27.604 INFO MemoryStore - Block broadcast_151 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:27.611 INFO MemoryStore - Block broadcast_151_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
18:46:27.611 INFO BlockManagerInfo - Added broadcast_151_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:27.611 INFO SparkContext - Created broadcast 151 from newAPIHadoopFile at PathSplitSource.java:96
18:46:27.632 INFO MemoryStore - Block broadcast_152 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
18:46:27.632 INFO MemoryStore - Block broadcast_152_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
18:46:27.633 INFO BlockManagerInfo - Added broadcast_152_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:27.633 INFO SparkContext - Created broadcast 152 from broadcast at ReadsSparkSink.java:133
18:46:27.634 INFO MemoryStore - Block broadcast_153 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
18:46:27.635 INFO MemoryStore - Block broadcast_153_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
18:46:27.635 INFO BlockManagerInfo - Added broadcast_153_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:27.635 INFO SparkContext - Created broadcast 153 from broadcast at AnySamSinkMultiple.java:80
18:46:27.637 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.637 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.637 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.649 INFO FileInputFormat - Total input files to process : 1
18:46:27.661 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:27.661 INFO DAGScheduler - Registering RDD 350 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 18
18:46:27.661 INFO DAGScheduler - Got job 62 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:27.661 INFO DAGScheduler - Final stage: ResultStage 85 (runJob at SparkHadoopWriter.scala:83)
18:46:27.661 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 84)
18:46:27.661 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 84)
18:46:27.662 INFO DAGScheduler - Submitting ShuffleMapStage 84 (MapPartitionsRDD[350] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:27.679 INFO MemoryStore - Block broadcast_154 stored as values in memory (estimated size 427.7 KiB, free 1917.3 MiB)
18:46:27.680 INFO MemoryStore - Block broadcast_154_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.2 MiB)
18:46:27.680 INFO BlockManagerInfo - Added broadcast_154_piece0 in memory on localhost:45727 (size: 154.6 KiB, free: 1919.5 MiB)
18:46:27.680 INFO SparkContext - Created broadcast 154 from broadcast at DAGScheduler.scala:1580
18:46:27.681 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 84 (MapPartitionsRDD[350] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:27.681 INFO TaskSchedulerImpl - Adding task set 84.0 with 1 tasks resource profile 0
18:46:27.681 INFO TaskSetManager - Starting task 0.0 in stage 84.0 (TID 126) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:27.682 INFO Executor - Running task 0.0 in stage 84.0 (TID 126)
18:46:27.712 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:27.732 INFO Executor - Finished task 0.0 in stage 84.0 (TID 126). 1149 bytes result sent to driver
18:46:27.733 INFO TaskSetManager - Finished task 0.0 in stage 84.0 (TID 126) in 52 ms on localhost (executor driver) (1/1)
18:46:27.733 INFO TaskSchedulerImpl - Removed TaskSet 84.0, whose tasks have all completed, from pool
18:46:27.733 INFO DAGScheduler - ShuffleMapStage 84 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.071 s
18:46:27.733 INFO DAGScheduler - looking for newly runnable stages
18:46:27.733 INFO DAGScheduler - running: HashSet()
18:46:27.733 INFO DAGScheduler - waiting: HashSet(ResultStage 85)
18:46:27.733 INFO DAGScheduler - failed: HashSet()
18:46:27.733 INFO DAGScheduler - Submitting ResultStage 85 (MapPartitionsRDD[362] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:27.741 INFO MemoryStore - Block broadcast_155 stored as values in memory (estimated size 150.2 KiB, free 1917.0 MiB)
18:46:27.742 INFO MemoryStore - Block broadcast_155_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1917.0 MiB)
18:46:27.742 INFO BlockManagerInfo - Added broadcast_155_piece0 in memory on localhost:45727 (size: 56.2 KiB, free: 1919.4 MiB)
18:46:27.742 INFO SparkContext - Created broadcast 155 from broadcast at DAGScheduler.scala:1580
18:46:27.742 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 85 (MapPartitionsRDD[362] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:27.742 INFO TaskSchedulerImpl - Adding task set 85.0 with 2 tasks resource profile 0
18:46:27.743 INFO TaskSetManager - Starting task 0.0 in stage 85.0 (TID 127) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:27.743 INFO TaskSetManager - Starting task 1.0 in stage 85.0 (TID 128) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:27.744 INFO Executor - Running task 0.0 in stage 85.0 (TID 127)
18:46:27.744 INFO Executor - Running task 1.0 in stage 85.0 (TID 128)
18:46:27.748 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.748 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.748 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.748 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.748 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.748 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.750 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.750 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.750 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.750 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:27.750 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:27.750 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:27.761 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.762 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.763 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.763 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.771 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846279021867769478948608_0362_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest19679429106441831932.bam/_temporary/0/task_202505191846279021867769478948608_0362_r_000001
18:46:27.771 INFO SparkHadoopMapRedUtil - attempt_202505191846279021867769478948608_0362_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:27.771 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846279021867769478948608_0362_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest19679429106441831932.bam/_temporary/0/task_202505191846279021867769478948608_0362_r_000000
18:46:27.771 INFO SparkHadoopMapRedUtil - attempt_202505191846279021867769478948608_0362_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:27.772 INFO Executor - Finished task 1.0 in stage 85.0 (TID 128). 1729 bytes result sent to driver
18:46:27.772 INFO Executor - Finished task 0.0 in stage 85.0 (TID 127). 1729 bytes result sent to driver
18:46:27.772 INFO TaskSetManager - Finished task 1.0 in stage 85.0 (TID 128) in 29 ms on localhost (executor driver) (1/2)
18:46:27.773 INFO TaskSetManager - Finished task 0.0 in stage 85.0 (TID 127) in 30 ms on localhost (executor driver) (2/2)
18:46:27.773 INFO TaskSchedulerImpl - Removed TaskSet 85.0, whose tasks have all completed, from pool
18:46:27.773 INFO DAGScheduler - ResultStage 85 (runJob at SparkHadoopWriter.scala:83) finished in 0.039 s
18:46:27.773 INFO DAGScheduler - Job 62 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.773 INFO TaskSchedulerImpl - Killing all running tasks in stage 85: Stage finished
18:46:27.774 INFO DAGScheduler - Job 62 finished: runJob at SparkHadoopWriter.scala:83, took 0.112956 s
18:46:27.774 INFO SparkHadoopWriter - Start to commit write Job job_202505191846279021867769478948608_0362.
18:46:27.781 INFO SparkHadoopWriter - Write Job job_202505191846279021867769478948608_0362 committed. Elapsed time: 6 ms.
18:46:27.784 INFO MemoryStore - Block broadcast_156 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
18:46:27.790 INFO MemoryStore - Block broadcast_156_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
18:46:27.790 INFO BlockManagerInfo - Added broadcast_156_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:27.790 INFO SparkContext - Created broadcast 156 from newAPIHadoopFile at PathSplitSource.java:96
18:46:27.813 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:27.814 INFO DAGScheduler - Got job 63 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:27.814 INFO DAGScheduler - Final stage: ResultStage 87 (count at ReadsSparkSinkUnitTest.java:222)
18:46:27.814 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 86)
18:46:27.814 INFO DAGScheduler - Missing parents: List()
18:46:27.814 INFO DAGScheduler - Submitting ResultStage 87 (MapPartitionsRDD[353] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:27.815 INFO MemoryStore - Block broadcast_157 stored as values in memory (estimated size 6.3 KiB, free 1916.6 MiB)
18:46:27.815 INFO MemoryStore - Block broadcast_157_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.6 MiB)
18:46:27.815 INFO BlockManagerInfo - Added broadcast_157_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.4 MiB)
18:46:27.815 INFO SparkContext - Created broadcast 157 from broadcast at DAGScheduler.scala:1580
18:46:27.816 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 87 (MapPartitionsRDD[353] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:27.816 INFO TaskSchedulerImpl - Adding task set 87.0 with 2 tasks resource profile 0
18:46:27.817 INFO TaskSetManager - Starting task 0.0 in stage 87.0 (TID 129) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:27.817 INFO TaskSetManager - Starting task 1.0 in stage 87.0 (TID 130) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:27.817 INFO Executor - Running task 0.0 in stage 87.0 (TID 129)
18:46:27.817 INFO Executor - Running task 1.0 in stage 87.0 (TID 130)
18:46:27.819 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.819 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:27.819 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.819 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:27.823 INFO Executor - Finished task 1.0 in stage 87.0 (TID 130). 1634 bytes result sent to driver
18:46:27.823 INFO TaskSetManager - Finished task 1.0 in stage 87.0 (TID 130) in 6 ms on localhost (executor driver) (1/2)
18:46:27.823 INFO Executor - Finished task 0.0 in stage 87.0 (TID 129). 1634 bytes result sent to driver
18:46:27.824 INFO TaskSetManager - Finished task 0.0 in stage 87.0 (TID 129) in 8 ms on localhost (executor driver) (2/2)
18:46:27.824 INFO TaskSchedulerImpl - Removed TaskSet 87.0, whose tasks have all completed, from pool
18:46:27.824 INFO DAGScheduler - ResultStage 87 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
18:46:27.824 INFO DAGScheduler - Job 63 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.824 INFO TaskSchedulerImpl - Killing all running tasks in stage 87: Stage finished
18:46:27.824 INFO DAGScheduler - Job 63 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011052 s
18:46:27.838 INFO FileInputFormat - Total input files to process : 2
18:46:27.842 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:27.842 INFO DAGScheduler - Got job 64 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:27.842 INFO DAGScheduler - Final stage: ResultStage 88 (count at ReadsSparkSinkUnitTest.java:222)
18:46:27.842 INFO DAGScheduler - Parents of final stage: List()
18:46:27.842 INFO DAGScheduler - Missing parents: List()
18:46:27.843 INFO DAGScheduler - Submitting ResultStage 88 (MapPartitionsRDD[369] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:27.874 INFO MemoryStore - Block broadcast_158 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
18:46:27.875 INFO MemoryStore - Block broadcast_158_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.1 MiB)
18:46:27.875 INFO BlockManagerInfo - Added broadcast_158_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.2 MiB)
18:46:27.876 INFO SparkContext - Created broadcast 158 from broadcast at DAGScheduler.scala:1580
18:46:27.876 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 88 (MapPartitionsRDD[369] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:27.876 INFO TaskSchedulerImpl - Adding task set 88.0 with 2 tasks resource profile 0
18:46:27.876 INFO TaskSetManager - Starting task 0.0 in stage 88.0 (TID 131) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
18:46:27.877 INFO TaskSetManager - Starting task 1.0 in stage 88.0 (TID 132) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
18:46:27.877 INFO Executor - Running task 0.0 in stage 88.0 (TID 131)
18:46:27.877 INFO Executor - Running task 1.0 in stage 88.0 (TID 132)
18:46:27.910 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19679429106441831932.bam/part-r-00000.bam:0+132492
18:46:27.919 INFO Executor - Finished task 1.0 in stage 88.0 (TID 132). 989 bytes result sent to driver
18:46:27.920 INFO TaskSetManager - Finished task 1.0 in stage 88.0 (TID 132) in 43 ms on localhost (executor driver) (1/2)
18:46:27.922 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19679429106441831932.bam/part-r-00001.bam:0+129330
18:46:27.935 INFO Executor - Finished task 0.0 in stage 88.0 (TID 131). 989 bytes result sent to driver
18:46:27.935 INFO TaskSetManager - Finished task 0.0 in stage 88.0 (TID 131) in 59 ms on localhost (executor driver) (2/2)
18:46:27.935 INFO TaskSchedulerImpl - Removed TaskSet 88.0, whose tasks have all completed, from pool
18:46:27.936 INFO DAGScheduler - ResultStage 88 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.093 s
18:46:27.936 INFO DAGScheduler - Job 64 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:27.936 INFO TaskSchedulerImpl - Killing all running tasks in stage 88: Stage finished
18:46:27.936 INFO DAGScheduler - Job 64 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.094249 s
18:46:27.940 INFO MemoryStore - Block broadcast_159 stored as values in memory (estimated size 297.9 KiB, free 1915.8 MiB)
18:46:27.949 INFO MemoryStore - Block broadcast_159_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.7 MiB)
18:46:27.949 INFO BlockManagerInfo - Added broadcast_159_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:27.949 INFO SparkContext - Created broadcast 159 from newAPIHadoopFile at PathSplitSource.java:96
18:46:27.976 INFO MemoryStore - Block broadcast_160 stored as values in memory (estimated size 297.9 KiB, free 1915.4 MiB)
18:46:27.985 INFO MemoryStore - Block broadcast_160_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.4 MiB)
18:46:27.985 INFO BlockManagerInfo - Added broadcast_160_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.1 MiB)
18:46:27.986 INFO SparkContext - Created broadcast 160 from newAPIHadoopFile at PathSplitSource.java:96
18:46:28.007 INFO MemoryStore - Block broadcast_161 stored as values in memory (estimated size 160.7 KiB, free 1915.2 MiB)
18:46:28.008 INFO MemoryStore - Block broadcast_161_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.2 MiB)
18:46:28.008 INFO BlockManagerInfo - Added broadcast_161_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.1 MiB)
18:46:28.008 INFO SparkContext - Created broadcast 161 from broadcast at ReadsSparkSink.java:133
18:46:28.010 INFO MemoryStore - Block broadcast_162 stored as values in memory (estimated size 163.2 KiB, free 1915.0 MiB)
18:46:28.017 INFO MemoryStore - Block broadcast_162_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.2 MiB)
18:46:28.017 INFO BlockManagerInfo - Removed broadcast_155_piece0 on localhost:45727 in memory (size: 56.2 KiB, free: 1919.2 MiB)
18:46:28.017 INFO BlockManagerInfo - Added broadcast_162_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.2 MiB)
18:46:28.017 INFO SparkContext - Created broadcast 162 from broadcast at AnySamSinkMultiple.java:80
18:46:28.018 INFO BlockManagerInfo - Removed broadcast_160_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:28.019 INFO BlockManagerInfo - Removed broadcast_141_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:28.020 INFO BlockManagerInfo - Removed broadcast_152_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:28.020 INFO BlockManagerInfo - Removed broadcast_156_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:28.020 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.020 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.020 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.021 INFO BlockManagerInfo - Removed broadcast_157_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.3 MiB)
18:46:28.021 INFO BlockManagerInfo - Removed broadcast_150_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:28.022 INFO BlockManagerInfo - Removed broadcast_154_piece0 on localhost:45727 in memory (size: 154.6 KiB, free: 1919.5 MiB)
18:46:28.022 INFO BlockManagerInfo - Removed broadcast_149_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:28.023 INFO BlockManagerInfo - Removed broadcast_147_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:28.023 INFO BlockManagerInfo - Removed broadcast_151_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:28.024 INFO BlockManagerInfo - Removed broadcast_153_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:28.025 INFO BlockManagerInfo - Removed broadcast_158_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.9 MiB)
18:46:28.035 INFO FileInputFormat - Total input files to process : 1
18:46:28.041 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:28.042 INFO DAGScheduler - Registering RDD 377 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 19
18:46:28.042 INFO DAGScheduler - Got job 65 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:28.042 INFO DAGScheduler - Final stage: ResultStage 90 (runJob at SparkHadoopWriter.scala:83)
18:46:28.042 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 89)
18:46:28.042 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 89)
18:46:28.042 INFO DAGScheduler - Submitting ShuffleMapStage 89 (MapPartitionsRDD[377] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:28.059 INFO MemoryStore - Block broadcast_163 stored as values in memory (estimated size 427.7 KiB, free 1918.9 MiB)
18:46:28.061 INFO MemoryStore - Block broadcast_163_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.8 MiB)
18:46:28.061 INFO BlockManagerInfo - Added broadcast_163_piece0 in memory on localhost:45727 (size: 154.6 KiB, free: 1919.8 MiB)
18:46:28.061 INFO SparkContext - Created broadcast 163 from broadcast at DAGScheduler.scala:1580
18:46:28.061 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 89 (MapPartitionsRDD[377] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:28.061 INFO TaskSchedulerImpl - Adding task set 89.0 with 1 tasks resource profile 0
18:46:28.062 INFO TaskSetManager - Starting task 0.0 in stage 89.0 (TID 133) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:28.062 INFO Executor - Running task 0.0 in stage 89.0 (TID 133)
18:46:28.092 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:28.109 INFO Executor - Finished task 0.0 in stage 89.0 (TID 133). 1149 bytes result sent to driver
18:46:28.109 INFO TaskSetManager - Finished task 0.0 in stage 89.0 (TID 133) in 47 ms on localhost (executor driver) (1/1)
18:46:28.109 INFO TaskSchedulerImpl - Removed TaskSet 89.0, whose tasks have all completed, from pool
18:46:28.109 INFO DAGScheduler - ShuffleMapStage 89 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.066 s
18:46:28.110 INFO DAGScheduler - looking for newly runnable stages
18:46:28.110 INFO DAGScheduler - running: HashSet()
18:46:28.110 INFO DAGScheduler - waiting: HashSet(ResultStage 90)
18:46:28.110 INFO DAGScheduler - failed: HashSet()
18:46:28.110 INFO DAGScheduler - Submitting ResultStage 90 (MapPartitionsRDD[389] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:28.116 INFO MemoryStore - Block broadcast_164 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
18:46:28.117 INFO MemoryStore - Block broadcast_164_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1918.6 MiB)
18:46:28.117 INFO BlockManagerInfo - Added broadcast_164_piece0 in memory on localhost:45727 (size: 56.2 KiB, free: 1919.7 MiB)
18:46:28.117 INFO SparkContext - Created broadcast 164 from broadcast at DAGScheduler.scala:1580
18:46:28.117 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 90 (MapPartitionsRDD[389] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.117 INFO TaskSchedulerImpl - Adding task set 90.0 with 2 tasks resource profile 0
18:46:28.118 INFO TaskSetManager - Starting task 0.0 in stage 90.0 (TID 134) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:28.118 INFO TaskSetManager - Starting task 1.0 in stage 90.0 (TID 135) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:28.119 INFO Executor - Running task 1.0 in stage 90.0 (TID 135)
18:46:28.119 INFO Executor - Running task 0.0 in stage 90.0 (TID 134)
18:46:28.123 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.123 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.123 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.124 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.124 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.124 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.125 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.125 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.125 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.126 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.126 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.126 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.137 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.138 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.140 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.140 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.146 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846283570460828396130673_0389_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest113733089559674908292.bam/_temporary/0/task_202505191846283570460828396130673_0389_r_000001
18:46:28.146 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846283570460828396130673_0389_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest113733089559674908292.bam/_temporary/0/task_202505191846283570460828396130673_0389_r_000000
18:46:28.146 INFO SparkHadoopMapRedUtil - attempt_202505191846283570460828396130673_0389_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:28.146 INFO SparkHadoopMapRedUtil - attempt_202505191846283570460828396130673_0389_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:28.147 INFO Executor - Finished task 1.0 in stage 90.0 (TID 135). 1729 bytes result sent to driver
18:46:28.147 INFO Executor - Finished task 0.0 in stage 90.0 (TID 134). 1729 bytes result sent to driver
18:46:28.147 INFO TaskSetManager - Finished task 1.0 in stage 90.0 (TID 135) in 29 ms on localhost (executor driver) (1/2)
18:46:28.147 INFO TaskSetManager - Finished task 0.0 in stage 90.0 (TID 134) in 29 ms on localhost (executor driver) (2/2)
18:46:28.148 INFO TaskSchedulerImpl - Removed TaskSet 90.0, whose tasks have all completed, from pool
18:46:28.148 INFO DAGScheduler - ResultStage 90 (runJob at SparkHadoopWriter.scala:83) finished in 0.038 s
18:46:28.148 INFO DAGScheduler - Job 65 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.148 INFO TaskSchedulerImpl - Killing all running tasks in stage 90: Stage finished
18:46:28.148 INFO DAGScheduler - Job 65 finished: runJob at SparkHadoopWriter.scala:83, took 0.106786 s
18:46:28.148 INFO SparkHadoopWriter - Start to commit write Job job_202505191846283570460828396130673_0389.
18:46:28.154 INFO SparkHadoopWriter - Write Job job_202505191846283570460828396130673_0389 committed. Elapsed time: 5 ms.
18:46:28.156 INFO MemoryStore - Block broadcast_165 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
18:46:28.163 INFO MemoryStore - Block broadcast_165_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
18:46:28.163 INFO BlockManagerInfo - Added broadcast_165_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:28.163 INFO SparkContext - Created broadcast 165 from newAPIHadoopFile at PathSplitSource.java:96
18:46:28.188 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:28.189 INFO DAGScheduler - Got job 66 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:28.189 INFO DAGScheduler - Final stage: ResultStage 92 (count at ReadsSparkSinkUnitTest.java:222)
18:46:28.189 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 91)
18:46:28.189 INFO DAGScheduler - Missing parents: List()
18:46:28.189 INFO DAGScheduler - Submitting ResultStage 92 (MapPartitionsRDD[380] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:28.190 INFO MemoryStore - Block broadcast_166 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
18:46:28.190 INFO MemoryStore - Block broadcast_166_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
18:46:28.190 INFO BlockManagerInfo - Added broadcast_166_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.7 MiB)
18:46:28.190 INFO SparkContext - Created broadcast 166 from broadcast at DAGScheduler.scala:1580
18:46:28.191 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 92 (MapPartitionsRDD[380] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.191 INFO TaskSchedulerImpl - Adding task set 92.0 with 2 tasks resource profile 0
18:46:28.191 INFO TaskSetManager - Starting task 0.0 in stage 92.0 (TID 136) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:28.192 INFO TaskSetManager - Starting task 1.0 in stage 92.0 (TID 137) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:28.192 INFO Executor - Running task 0.0 in stage 92.0 (TID 136)
18:46:28.192 INFO Executor - Running task 1.0 in stage 92.0 (TID 137)
18:46:28.194 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.194 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.194 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.194 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.198 INFO Executor - Finished task 1.0 in stage 92.0 (TID 137). 1634 bytes result sent to driver
18:46:28.198 INFO Executor - Finished task 0.0 in stage 92.0 (TID 136). 1634 bytes result sent to driver
18:46:28.199 INFO TaskSetManager - Finished task 1.0 in stage 92.0 (TID 137) in 8 ms on localhost (executor driver) (1/2)
18:46:28.199 INFO TaskSetManager - Finished task 0.0 in stage 92.0 (TID 136) in 8 ms on localhost (executor driver) (2/2)
18:46:28.199 INFO TaskSchedulerImpl - Removed TaskSet 92.0, whose tasks have all completed, from pool
18:46:28.199 INFO DAGScheduler - ResultStage 92 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
18:46:28.200 INFO DAGScheduler - Job 66 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.200 INFO TaskSchedulerImpl - Killing all running tasks in stage 92: Stage finished
18:46:28.200 INFO DAGScheduler - Job 66 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011588 s
18:46:28.213 INFO FileInputFormat - Total input files to process : 2
18:46:28.216 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:28.216 INFO DAGScheduler - Got job 67 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:28.216 INFO DAGScheduler - Final stage: ResultStage 93 (count at ReadsSparkSinkUnitTest.java:222)
18:46:28.216 INFO DAGScheduler - Parents of final stage: List()
18:46:28.216 INFO DAGScheduler - Missing parents: List()
18:46:28.217 INFO DAGScheduler - Submitting ResultStage 93 (MapPartitionsRDD[396] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:28.234 INFO MemoryStore - Block broadcast_167 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
18:46:28.235 INFO MemoryStore - Block broadcast_167_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
18:46:28.235 INFO BlockManagerInfo - Added broadcast_167_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:28.235 INFO SparkContext - Created broadcast 167 from broadcast at DAGScheduler.scala:1580
18:46:28.236 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 93 (MapPartitionsRDD[396] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.236 INFO TaskSchedulerImpl - Adding task set 93.0 with 2 tasks resource profile 0
18:46:28.236 INFO TaskSetManager - Starting task 0.0 in stage 93.0 (TID 138) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
18:46:28.236 INFO TaskSetManager - Starting task 1.0 in stage 93.0 (TID 139) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
18:46:28.237 INFO Executor - Running task 0.0 in stage 93.0 (TID 138)
18:46:28.237 INFO Executor - Running task 1.0 in stage 93.0 (TID 139)
18:46:28.266 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest113733089559674908292.bam/part-r-00001.bam:0+129330
18:46:28.276 INFO Executor - Finished task 0.0 in stage 93.0 (TID 138). 989 bytes result sent to driver
18:46:28.277 INFO TaskSetManager - Finished task 0.0 in stage 93.0 (TID 138) in 41 ms on localhost (executor driver) (1/2)
18:46:28.280 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest113733089559674908292.bam/part-r-00000.bam:0+132492
18:46:28.289 INFO Executor - Finished task 1.0 in stage 93.0 (TID 139). 989 bytes result sent to driver
18:46:28.289 INFO TaskSetManager - Finished task 1.0 in stage 93.0 (TID 139) in 53 ms on localhost (executor driver) (2/2)
18:46:28.289 INFO TaskSchedulerImpl - Removed TaskSet 93.0, whose tasks have all completed, from pool
18:46:28.289 INFO DAGScheduler - ResultStage 93 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.072 s
18:46:28.290 INFO DAGScheduler - Job 67 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.290 INFO TaskSchedulerImpl - Killing all running tasks in stage 93: Stage finished
18:46:28.290 INFO DAGScheduler - Job 67 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.073694 s
18:46:28.294 INFO MemoryStore - Block broadcast_168 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
18:46:28.304 INFO MemoryStore - Block broadcast_168_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
18:46:28.304 INFO BlockManagerInfo - Added broadcast_168_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:28.305 INFO SparkContext - Created broadcast 168 from newAPIHadoopFile at PathSplitSource.java:96
18:46:28.332 INFO MemoryStore - Block broadcast_169 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
18:46:28.338 INFO MemoryStore - Block broadcast_169_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
18:46:28.339 INFO BlockManagerInfo - Added broadcast_169_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:28.339 INFO SparkContext - Created broadcast 169 from newAPIHadoopFile at PathSplitSource.java:96
18:46:28.359 INFO MemoryStore - Block broadcast_170 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
18:46:28.360 INFO MemoryStore - Block broadcast_170_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
18:46:28.360 INFO BlockManagerInfo - Added broadcast_170_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.4 MiB)
18:46:28.360 INFO SparkContext - Created broadcast 170 from broadcast at ReadsSparkSink.java:133
18:46:28.361 INFO MemoryStore - Block broadcast_171 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
18:46:28.362 INFO MemoryStore - Block broadcast_171_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
18:46:28.362 INFO BlockManagerInfo - Added broadcast_171_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.4 MiB)
18:46:28.362 INFO SparkContext - Created broadcast 171 from broadcast at AnySamSinkMultiple.java:80
18:46:28.364 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.364 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.364 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.376 INFO FileInputFormat - Total input files to process : 1
18:46:28.382 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:28.383 INFO DAGScheduler - Registering RDD 404 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 20
18:46:28.383 INFO DAGScheduler - Got job 68 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:28.383 INFO DAGScheduler - Final stage: ResultStage 95 (runJob at SparkHadoopWriter.scala:83)
18:46:28.383 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 94)
18:46:28.383 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 94)
18:46:28.383 INFO DAGScheduler - Submitting ShuffleMapStage 94 (MapPartitionsRDD[404] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:28.400 INFO MemoryStore - Block broadcast_172 stored as values in memory (estimated size 427.7 KiB, free 1916.2 MiB)
18:46:28.402 INFO MemoryStore - Block broadcast_172_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.1 MiB)
18:46:28.402 INFO BlockManagerInfo - Added broadcast_172_piece0 in memory on localhost:45727 (size: 154.6 KiB, free: 1919.3 MiB)
18:46:28.402 INFO SparkContext - Created broadcast 172 from broadcast at DAGScheduler.scala:1580
18:46:28.402 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 94 (MapPartitionsRDD[404] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:28.402 INFO TaskSchedulerImpl - Adding task set 94.0 with 1 tasks resource profile 0
18:46:28.403 INFO TaskSetManager - Starting task 0.0 in stage 94.0 (TID 140) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:28.403 INFO Executor - Running task 0.0 in stage 94.0 (TID 140)
18:46:28.432 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:28.449 INFO Executor - Finished task 0.0 in stage 94.0 (TID 140). 1149 bytes result sent to driver
18:46:28.450 INFO TaskSetManager - Finished task 0.0 in stage 94.0 (TID 140) in 48 ms on localhost (executor driver) (1/1)
18:46:28.450 INFO TaskSchedulerImpl - Removed TaskSet 94.0, whose tasks have all completed, from pool
18:46:28.450 INFO DAGScheduler - ShuffleMapStage 94 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.067 s
18:46:28.450 INFO DAGScheduler - looking for newly runnable stages
18:46:28.450 INFO DAGScheduler - running: HashSet()
18:46:28.450 INFO DAGScheduler - waiting: HashSet(ResultStage 95)
18:46:28.450 INFO DAGScheduler - failed: HashSet()
18:46:28.450 INFO DAGScheduler - Submitting ResultStage 95 (MapPartitionsRDD[416] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:28.456 INFO MemoryStore - Block broadcast_173 stored as values in memory (estimated size 150.2 KiB, free 1915.9 MiB)
18:46:28.457 INFO MemoryStore - Block broadcast_173_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1915.9 MiB)
18:46:28.457 INFO BlockManagerInfo - Added broadcast_173_piece0 in memory on localhost:45727 (size: 56.3 KiB, free: 1919.2 MiB)
18:46:28.458 INFO SparkContext - Created broadcast 173 from broadcast at DAGScheduler.scala:1580
18:46:28.458 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 95 (MapPartitionsRDD[416] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.458 INFO TaskSchedulerImpl - Adding task set 95.0 with 2 tasks resource profile 0
18:46:28.458 INFO TaskSetManager - Starting task 0.0 in stage 95.0 (TID 141) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:28.458 INFO TaskSetManager - Starting task 1.0 in stage 95.0 (TID 142) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:28.459 INFO Executor - Running task 0.0 in stage 95.0 (TID 141)
18:46:28.459 INFO Executor - Running task 1.0 in stage 95.0 (TID 142)
18:46:28.465 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.465 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.465 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.465 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.465 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.465 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.465 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.465 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.465 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.465 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.465 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.465 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.475 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.475 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.479 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.479 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.483 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846285953676140842009545_0416_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest115064329093969457337.bam/_temporary/0/task_202505191846285953676140842009545_0416_r_000000
18:46:28.483 INFO SparkHadoopMapRedUtil - attempt_202505191846285953676140842009545_0416_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:28.484 INFO Executor - Finished task 0.0 in stage 95.0 (TID 141). 1729 bytes result sent to driver
18:46:28.484 INFO TaskSetManager - Finished task 0.0 in stage 95.0 (TID 141) in 26 ms on localhost (executor driver) (1/2)
18:46:28.487 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846285953676140842009545_0416_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest115064329093969457337.bam/_temporary/0/task_202505191846285953676140842009545_0416_r_000001
18:46:28.487 INFO SparkHadoopMapRedUtil - attempt_202505191846285953676140842009545_0416_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:28.488 INFO Executor - Finished task 1.0 in stage 95.0 (TID 142). 1729 bytes result sent to driver
18:46:28.488 INFO TaskSetManager - Finished task 1.0 in stage 95.0 (TID 142) in 30 ms on localhost (executor driver) (2/2)
18:46:28.488 INFO TaskSchedulerImpl - Removed TaskSet 95.0, whose tasks have all completed, from pool
18:46:28.488 INFO DAGScheduler - ResultStage 95 (runJob at SparkHadoopWriter.scala:83) finished in 0.037 s
18:46:28.488 INFO DAGScheduler - Job 68 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.488 INFO TaskSchedulerImpl - Killing all running tasks in stage 95: Stage finished
18:46:28.489 INFO DAGScheduler - Job 68 finished: runJob at SparkHadoopWriter.scala:83, took 0.106395 s
18:46:28.489 INFO SparkHadoopWriter - Start to commit write Job job_202505191846285953676140842009545_0416.
18:46:28.494 INFO SparkHadoopWriter - Write Job job_202505191846285953676140842009545_0416 committed. Elapsed time: 5 ms.
18:46:28.497 INFO MemoryStore - Block broadcast_174 stored as values in memory (estimated size 297.9 KiB, free 1915.6 MiB)
18:46:28.508 INFO MemoryStore - Block broadcast_174_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.5 MiB)
18:46:28.508 INFO BlockManagerInfo - Added broadcast_174_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:28.508 INFO SparkContext - Created broadcast 174 from newAPIHadoopFile at PathSplitSource.java:96
18:46:28.531 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:28.531 INFO DAGScheduler - Got job 69 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:28.531 INFO DAGScheduler - Final stage: ResultStage 97 (count at ReadsSparkSinkUnitTest.java:222)
18:46:28.531 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 96)
18:46:28.531 INFO DAGScheduler - Missing parents: List()
18:46:28.532 INFO DAGScheduler - Submitting ResultStage 97 (MapPartitionsRDD[407] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:28.532 INFO MemoryStore - Block broadcast_175 stored as values in memory (estimated size 6.3 KiB, free 1915.5 MiB)
18:46:28.539 INFO MemoryStore - Block broadcast_175_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1915.5 MiB)
18:46:28.539 INFO BlockManagerInfo - Added broadcast_175_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.1 MiB)
18:46:28.539 INFO SparkContext - Created broadcast 175 from broadcast at DAGScheduler.scala:1580
18:46:28.539 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 97 (MapPartitionsRDD[407] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.539 INFO TaskSchedulerImpl - Adding task set 97.0 with 2 tasks resource profile 0
18:46:28.540 INFO BlockManagerInfo - Removed broadcast_162_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.2 MiB)
18:46:28.540 INFO TaskSetManager - Starting task 0.0 in stage 97.0 (TID 143) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:28.540 INFO TaskSetManager - Starting task 1.0 in stage 97.0 (TID 144) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:28.541 INFO Executor - Running task 0.0 in stage 97.0 (TID 143)
18:46:28.541 INFO BlockManagerInfo - Removed broadcast_165_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:28.541 INFO Executor - Running task 1.0 in stage 97.0 (TID 144)
18:46:28.543 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.543 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.543 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.543 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.543 INFO BlockManagerInfo - Removed broadcast_167_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.4 MiB)
18:46:28.544 INFO BlockManagerInfo - Removed broadcast_164_piece0 on localhost:45727 in memory (size: 56.2 KiB, free: 1919.4 MiB)
18:46:28.545 INFO BlockManagerInfo - Removed broadcast_171_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.4 MiB)
18:46:28.547 INFO BlockManagerInfo - Removed broadcast_161_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.4 MiB)
18:46:28.548 INFO Executor - Finished task 0.0 in stage 97.0 (TID 143). 1634 bytes result sent to driver
18:46:28.548 INFO Executor - Finished task 1.0 in stage 97.0 (TID 144). 1591 bytes result sent to driver
18:46:28.548 INFO TaskSetManager - Finished task 0.0 in stage 97.0 (TID 143) in 8 ms on localhost (executor driver) (1/2)
18:46:28.548 INFO BlockManagerInfo - Removed broadcast_170_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.4 MiB)
18:46:28.549 INFO BlockManagerInfo - Removed broadcast_163_piece0 on localhost:45727 in memory (size: 154.6 KiB, free: 1919.6 MiB)
18:46:28.549 INFO TaskSetManager - Finished task 1.0 in stage 97.0 (TID 144) in 9 ms on localhost (executor driver) (2/2)
18:46:28.549 INFO TaskSchedulerImpl - Removed TaskSet 97.0, whose tasks have all completed, from pool
18:46:28.549 INFO DAGScheduler - ResultStage 97 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.017 s
18:46:28.549 INFO DAGScheduler - Job 69 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.549 INFO TaskSchedulerImpl - Killing all running tasks in stage 97: Stage finished
18:46:28.549 INFO DAGScheduler - Job 69 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.018177 s
18:46:28.549 INFO BlockManagerInfo - Removed broadcast_166_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.6 MiB)
18:46:28.551 INFO BlockManagerInfo - Removed broadcast_169_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:28.551 INFO BlockManagerInfo - Removed broadcast_173_piece0 on localhost:45727 in memory (size: 56.3 KiB, free: 1919.7 MiB)
18:46:28.552 INFO BlockManagerInfo - Removed broadcast_159_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:28.552 INFO BlockManagerInfo - Removed broadcast_172_piece0 on localhost:45727 in memory (size: 154.6 KiB, free: 1919.9 MiB)
18:46:28.563 INFO FileInputFormat - Total input files to process : 2
18:46:28.568 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:28.568 INFO DAGScheduler - Got job 70 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:28.568 INFO DAGScheduler - Final stage: ResultStage 98 (count at ReadsSparkSinkUnitTest.java:222)
18:46:28.568 INFO DAGScheduler - Parents of final stage: List()
18:46:28.568 INFO DAGScheduler - Missing parents: List()
18:46:28.568 INFO DAGScheduler - Submitting ResultStage 98 (MapPartitionsRDD[423] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:28.585 INFO MemoryStore - Block broadcast_176 stored as values in memory (estimated size 426.1 KiB, free 1918.9 MiB)
18:46:28.587 INFO MemoryStore - Block broadcast_176_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.7 MiB)
18:46:28.587 INFO BlockManagerInfo - Added broadcast_176_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.7 MiB)
18:46:28.587 INFO SparkContext - Created broadcast 176 from broadcast at DAGScheduler.scala:1580
18:46:28.587 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 98 (MapPartitionsRDD[423] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.587 INFO TaskSchedulerImpl - Adding task set 98.0 with 2 tasks resource profile 0
18:46:28.588 INFO TaskSetManager - Starting task 0.0 in stage 98.0 (TID 145) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
18:46:28.588 INFO TaskSetManager - Starting task 1.0 in stage 98.0 (TID 146) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
18:46:28.588 INFO Executor - Running task 0.0 in stage 98.0 (TID 145)
18:46:28.588 INFO Executor - Running task 1.0 in stage 98.0 (TID 146)
18:46:28.619 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115064329093969457337.bam/part-r-00001.bam:0+129330
18:46:28.629 INFO Executor - Finished task 0.0 in stage 98.0 (TID 145). 989 bytes result sent to driver
18:46:28.630 INFO TaskSetManager - Finished task 0.0 in stage 98.0 (TID 145) in 42 ms on localhost (executor driver) (1/2)
18:46:28.634 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115064329093969457337.bam/part-r-00000.bam:0+132492
18:46:28.645 INFO Executor - Finished task 1.0 in stage 98.0 (TID 146). 989 bytes result sent to driver
18:46:28.646 INFO TaskSetManager - Finished task 1.0 in stage 98.0 (TID 146) in 58 ms on localhost (executor driver) (2/2)
18:46:28.646 INFO TaskSchedulerImpl - Removed TaskSet 98.0, whose tasks have all completed, from pool
18:46:28.646 INFO DAGScheduler - ResultStage 98 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.077 s
18:46:28.646 INFO DAGScheduler - Job 70 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.646 INFO TaskSchedulerImpl - Killing all running tasks in stage 98: Stage finished
18:46:28.646 INFO DAGScheduler - Job 70 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.078379 s
18:46:28.649 INFO MemoryStore - Block broadcast_177 stored as values in memory (estimated size 298.0 KiB, free 1918.5 MiB)
18:46:28.655 INFO MemoryStore - Block broadcast_177_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.4 MiB)
18:46:28.656 INFO BlockManagerInfo - Added broadcast_177_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.7 MiB)
18:46:28.656 INFO SparkContext - Created broadcast 177 from newAPIHadoopFile at PathSplitSource.java:96
18:46:28.679 INFO MemoryStore - Block broadcast_178 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
18:46:28.685 INFO MemoryStore - Block broadcast_178_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.1 MiB)
18:46:28.686 INFO BlockManagerInfo - Added broadcast_178_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.7 MiB)
18:46:28.686 INFO SparkContext - Created broadcast 178 from newAPIHadoopFile at PathSplitSource.java:96
18:46:28.706 INFO MemoryStore - Block broadcast_179 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
18:46:28.707 INFO MemoryStore - Block broadcast_179_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
18:46:28.707 INFO BlockManagerInfo - Added broadcast_179_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:28.707 INFO SparkContext - Created broadcast 179 from broadcast at ReadsSparkSink.java:133
18:46:28.708 INFO MemoryStore - Block broadcast_180 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
18:46:28.709 INFO MemoryStore - Block broadcast_180_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
18:46:28.709 INFO BlockManagerInfo - Added broadcast_180_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:28.709 INFO SparkContext - Created broadcast 180 from broadcast at AnySamSinkMultiple.java:80
18:46:28.711 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.711 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.711 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.723 INFO FileInputFormat - Total input files to process : 1
18:46:28.729 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:28.730 INFO DAGScheduler - Registering RDD 431 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 21
18:46:28.730 INFO DAGScheduler - Got job 71 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:28.730 INFO DAGScheduler - Final stage: ResultStage 100 (runJob at SparkHadoopWriter.scala:83)
18:46:28.730 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 99)
18:46:28.730 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 99)
18:46:28.730 INFO DAGScheduler - Submitting ShuffleMapStage 99 (MapPartitionsRDD[431] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:28.747 INFO MemoryStore - Block broadcast_181 stored as values in memory (estimated size 427.7 KiB, free 1917.3 MiB)
18:46:28.749 INFO MemoryStore - Block broadcast_181_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.2 MiB)
18:46:28.749 INFO BlockManagerInfo - Added broadcast_181_piece0 in memory on localhost:45727 (size: 154.6 KiB, free: 1919.5 MiB)
18:46:28.749 INFO SparkContext - Created broadcast 181 from broadcast at DAGScheduler.scala:1580
18:46:28.749 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 99 (MapPartitionsRDD[431] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:28.750 INFO TaskSchedulerImpl - Adding task set 99.0 with 1 tasks resource profile 0
18:46:28.750 INFO TaskSetManager - Starting task 0.0 in stage 99.0 (TID 147) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
18:46:28.751 INFO Executor - Running task 0.0 in stage 99.0 (TID 147)
18:46:28.782 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:28.800 INFO Executor - Finished task 0.0 in stage 99.0 (TID 147). 1149 bytes result sent to driver
18:46:28.801 INFO TaskSetManager - Finished task 0.0 in stage 99.0 (TID 147) in 51 ms on localhost (executor driver) (1/1)
18:46:28.801 INFO TaskSchedulerImpl - Removed TaskSet 99.0, whose tasks have all completed, from pool
18:46:28.801 INFO DAGScheduler - ShuffleMapStage 99 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.070 s
18:46:28.801 INFO DAGScheduler - looking for newly runnable stages
18:46:28.801 INFO DAGScheduler - running: HashSet()
18:46:28.801 INFO DAGScheduler - waiting: HashSet(ResultStage 100)
18:46:28.801 INFO DAGScheduler - failed: HashSet()
18:46:28.801 INFO DAGScheduler - Submitting ResultStage 100 (MapPartitionsRDD[443] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:28.810 INFO MemoryStore - Block broadcast_182 stored as values in memory (estimated size 150.2 KiB, free 1917.0 MiB)
18:46:28.810 INFO MemoryStore - Block broadcast_182_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1917.0 MiB)
18:46:28.811 INFO BlockManagerInfo - Added broadcast_182_piece0 in memory on localhost:45727 (size: 56.3 KiB, free: 1919.4 MiB)
18:46:28.811 INFO SparkContext - Created broadcast 182 from broadcast at DAGScheduler.scala:1580
18:46:28.811 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 100 (MapPartitionsRDD[443] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.811 INFO TaskSchedulerImpl - Adding task set 100.0 with 2 tasks resource profile 0
18:46:28.812 INFO TaskSetManager - Starting task 0.0 in stage 100.0 (TID 148) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:28.812 INFO TaskSetManager - Starting task 1.0 in stage 100.0 (TID 149) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:28.812 INFO Executor - Running task 1.0 in stage 100.0 (TID 149)
18:46:28.812 INFO Executor - Running task 0.0 in stage 100.0 (TID 148)
18:46:28.816 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.816 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.816 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.816 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.817 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.817 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.817 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.817 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.817 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.818 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:28.818 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:28.818 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:28.827 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.827 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.830 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.830 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.834 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846285229836950451492755_0443_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest27190953758514679356.bam/_temporary/0/task_202505191846285229836950451492755_0443_r_000000
18:46:28.834 INFO SparkHadoopMapRedUtil - attempt_202505191846285229836950451492755_0443_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:28.834 INFO Executor - Finished task 0.0 in stage 100.0 (TID 148). 1729 bytes result sent to driver
18:46:28.835 INFO TaskSetManager - Finished task 0.0 in stage 100.0 (TID 148) in 23 ms on localhost (executor driver) (1/2)
18:46:28.837 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846285229836950451492755_0443_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest27190953758514679356.bam/_temporary/0/task_202505191846285229836950451492755_0443_r_000001
18:46:28.837 INFO SparkHadoopMapRedUtil - attempt_202505191846285229836950451492755_0443_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:28.837 INFO Executor - Finished task 1.0 in stage 100.0 (TID 149). 1729 bytes result sent to driver
18:46:28.837 INFO TaskSetManager - Finished task 1.0 in stage 100.0 (TID 149) in 25 ms on localhost (executor driver) (2/2)
18:46:28.837 INFO TaskSchedulerImpl - Removed TaskSet 100.0, whose tasks have all completed, from pool
18:46:28.837 INFO DAGScheduler - ResultStage 100 (runJob at SparkHadoopWriter.scala:83) finished in 0.035 s
18:46:28.838 INFO DAGScheduler - Job 71 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.838 INFO TaskSchedulerImpl - Killing all running tasks in stage 100: Stage finished
18:46:28.838 INFO DAGScheduler - Job 71 finished: runJob at SparkHadoopWriter.scala:83, took 0.108367 s
18:46:28.838 INFO SparkHadoopWriter - Start to commit write Job job_202505191846285229836950451492755_0443.
18:46:28.843 INFO SparkHadoopWriter - Write Job job_202505191846285229836950451492755_0443 committed. Elapsed time: 4 ms.
18:46:28.845 INFO MemoryStore - Block broadcast_183 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
18:46:28.852 INFO MemoryStore - Block broadcast_183_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
18:46:28.852 INFO BlockManagerInfo - Added broadcast_183_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:28.852 INFO SparkContext - Created broadcast 183 from newAPIHadoopFile at PathSplitSource.java:96
18:46:28.875 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:28.876 INFO DAGScheduler - Got job 72 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:28.876 INFO DAGScheduler - Final stage: ResultStage 102 (count at ReadsSparkSinkUnitTest.java:222)
18:46:28.876 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 101)
18:46:28.876 INFO DAGScheduler - Missing parents: List()
18:46:28.876 INFO DAGScheduler - Submitting ResultStage 102 (MapPartitionsRDD[434] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:28.877 INFO MemoryStore - Block broadcast_184 stored as values in memory (estimated size 6.3 KiB, free 1916.6 MiB)
18:46:28.877 INFO MemoryStore - Block broadcast_184_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.6 MiB)
18:46:28.877 INFO BlockManagerInfo - Added broadcast_184_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.4 MiB)
18:46:28.877 INFO SparkContext - Created broadcast 184 from broadcast at DAGScheduler.scala:1580
18:46:28.877 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 102 (MapPartitionsRDD[434] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.877 INFO TaskSchedulerImpl - Adding task set 102.0 with 2 tasks resource profile 0
18:46:28.878 INFO TaskSetManager - Starting task 0.0 in stage 102.0 (TID 150) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:28.878 INFO TaskSetManager - Starting task 1.0 in stage 102.0 (TID 151) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:28.878 INFO Executor - Running task 1.0 in stage 102.0 (TID 151)
18:46:28.878 INFO Executor - Running task 0.0 in stage 102.0 (TID 150)
18:46:28.880 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.880 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:28.880 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.880 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:28.884 INFO Executor - Finished task 1.0 in stage 102.0 (TID 151). 1591 bytes result sent to driver
18:46:28.884 INFO Executor - Finished task 0.0 in stage 102.0 (TID 150). 1591 bytes result sent to driver
18:46:28.885 INFO TaskSetManager - Finished task 1.0 in stage 102.0 (TID 151) in 7 ms on localhost (executor driver) (1/2)
18:46:28.885 INFO TaskSetManager - Finished task 0.0 in stage 102.0 (TID 150) in 7 ms on localhost (executor driver) (2/2)
18:46:28.885 INFO TaskSchedulerImpl - Removed TaskSet 102.0, whose tasks have all completed, from pool
18:46:28.885 INFO DAGScheduler - ResultStage 102 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.009 s
18:46:28.885 INFO DAGScheduler - Job 72 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.885 INFO TaskSchedulerImpl - Killing all running tasks in stage 102: Stage finished
18:46:28.885 INFO DAGScheduler - Job 72 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010142 s
18:46:28.898 INFO FileInputFormat - Total input files to process : 2
18:46:28.901 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:28.902 INFO DAGScheduler - Got job 73 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:28.902 INFO DAGScheduler - Final stage: ResultStage 103 (count at ReadsSparkSinkUnitTest.java:222)
18:46:28.902 INFO DAGScheduler - Parents of final stage: List()
18:46:28.902 INFO DAGScheduler - Missing parents: List()
18:46:28.902 INFO DAGScheduler - Submitting ResultStage 103 (MapPartitionsRDD[450] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:28.919 INFO MemoryStore - Block broadcast_185 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
18:46:28.920 INFO MemoryStore - Block broadcast_185_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.0 MiB)
18:46:28.920 INFO BlockManagerInfo - Added broadcast_185_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.2 MiB)
18:46:28.920 INFO SparkContext - Created broadcast 185 from broadcast at DAGScheduler.scala:1580
18:46:28.921 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 103 (MapPartitionsRDD[450] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:28.921 INFO TaskSchedulerImpl - Adding task set 103.0 with 2 tasks resource profile 0
18:46:28.921 INFO TaskSetManager - Starting task 0.0 in stage 103.0 (TID 152) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
18:46:28.921 INFO TaskSetManager - Starting task 1.0 in stage 103.0 (TID 153) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
18:46:28.922 INFO Executor - Running task 0.0 in stage 103.0 (TID 152)
18:46:28.922 INFO Executor - Running task 1.0 in stage 103.0 (TID 153)
18:46:28.957 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest27190953758514679356.bam/part-r-00000.bam:0+129755
18:46:28.957 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest27190953758514679356.bam/part-r-00001.bam:0+129440
18:46:28.967 INFO Executor - Finished task 0.0 in stage 103.0 (TID 152). 989 bytes result sent to driver
18:46:28.967 INFO TaskSetManager - Finished task 0.0 in stage 103.0 (TID 152) in 46 ms on localhost (executor driver) (1/2)
18:46:28.968 INFO Executor - Finished task 1.0 in stage 103.0 (TID 153). 989 bytes result sent to driver
18:46:28.968 INFO TaskSetManager - Finished task 1.0 in stage 103.0 (TID 153) in 47 ms on localhost (executor driver) (2/2)
18:46:28.968 INFO TaskSchedulerImpl - Removed TaskSet 103.0, whose tasks have all completed, from pool
18:46:28.968 INFO DAGScheduler - ResultStage 103 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.066 s
18:46:28.968 INFO DAGScheduler - Job 73 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:28.968 INFO TaskSchedulerImpl - Killing all running tasks in stage 103: Stage finished
18:46:28.968 INFO DAGScheduler - Job 73 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.067033 s
18:46:28.971 INFO MemoryStore - Block broadcast_186 stored as values in memory (estimated size 298.0 KiB, free 1915.8 MiB)
18:46:28.977 INFO MemoryStore - Block broadcast_186_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.7 MiB)
18:46:28.977 INFO BlockManagerInfo - Added broadcast_186_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:28.977 INFO SparkContext - Created broadcast 186 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.000 INFO MemoryStore - Block broadcast_187 stored as values in memory (estimated size 298.0 KiB, free 1915.4 MiB)
18:46:29.009 INFO BlockManagerInfo - Removed broadcast_183_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:29.010 INFO BlockManagerInfo - Removed broadcast_176_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.4 MiB)
18:46:29.010 INFO BlockManagerInfo - Removed broadcast_168_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:29.011 INFO BlockManagerInfo - Removed broadcast_185_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.6 MiB)
18:46:29.012 INFO BlockManagerInfo - Removed broadcast_184_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.6 MiB)
18:46:29.013 INFO BlockManagerInfo - Removed broadcast_181_piece0 on localhost:45727 in memory (size: 154.6 KiB, free: 1919.7 MiB)
18:46:29.014 INFO BlockManagerInfo - Removed broadcast_174_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:29.014 INFO BlockManagerInfo - Removed broadcast_177_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.8 MiB)
18:46:29.015 INFO BlockManagerInfo - Removed broadcast_175_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.8 MiB)
18:46:29.015 INFO BlockManagerInfo - Removed broadcast_179_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:29.016 INFO MemoryStore - Block broadcast_187_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.6 MiB)
18:46:29.016 INFO BlockManagerInfo - Removed broadcast_180_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:29.016 INFO BlockManagerInfo - Added broadcast_187_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.8 MiB)
18:46:29.016 INFO SparkContext - Created broadcast 187 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.018 INFO BlockManagerInfo - Removed broadcast_182_piece0 on localhost:45727 in memory (size: 56.3 KiB, free: 1919.9 MiB)
18:46:29.019 INFO BlockManagerInfo - Removed broadcast_178_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.9 MiB)
18:46:29.046 INFO MemoryStore - Block broadcast_188 stored as values in memory (estimated size 19.6 KiB, free 1919.3 MiB)
18:46:29.046 INFO MemoryStore - Block broadcast_188_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1919.3 MiB)
18:46:29.046 INFO BlockManagerInfo - Added broadcast_188_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.9 MiB)
18:46:29.047 INFO SparkContext - Created broadcast 188 from broadcast at ReadsSparkSink.java:133
18:46:29.048 INFO MemoryStore - Block broadcast_189 stored as values in memory (estimated size 20.0 KiB, free 1919.3 MiB)
18:46:29.048 INFO MemoryStore - Block broadcast_189_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1919.3 MiB)
18:46:29.048 INFO BlockManagerInfo - Added broadcast_189_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.9 MiB)
18:46:29.049 INFO SparkContext - Created broadcast 189 from broadcast at AnySamSinkMultiple.java:80
18:46:29.051 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.051 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.051 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.063 INFO FileInputFormat - Total input files to process : 1
18:46:29.070 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:29.070 INFO DAGScheduler - Registering RDD 458 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 22
18:46:29.070 INFO DAGScheduler - Got job 74 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:29.070 INFO DAGScheduler - Final stage: ResultStage 105 (runJob at SparkHadoopWriter.scala:83)
18:46:29.071 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 104)
18:46:29.071 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 104)
18:46:29.071 INFO DAGScheduler - Submitting ShuffleMapStage 104 (MapPartitionsRDD[458] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:29.088 INFO MemoryStore - Block broadcast_190 stored as values in memory (estimated size 427.7 KiB, free 1918.9 MiB)
18:46:29.089 INFO MemoryStore - Block broadcast_190_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.7 MiB)
18:46:29.090 INFO BlockManagerInfo - Added broadcast_190_piece0 in memory on localhost:45727 (size: 154.6 KiB, free: 1919.7 MiB)
18:46:29.090 INFO SparkContext - Created broadcast 190 from broadcast at DAGScheduler.scala:1580
18:46:29.090 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 104 (MapPartitionsRDD[458] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:29.090 INFO TaskSchedulerImpl - Adding task set 104.0 with 1 tasks resource profile 0
18:46:29.091 INFO TaskSetManager - Starting task 0.0 in stage 104.0 (TID 154) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
18:46:29.091 INFO Executor - Running task 0.0 in stage 104.0 (TID 154)
18:46:29.125 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:29.139 INFO Executor - Finished task 0.0 in stage 104.0 (TID 154). 1149 bytes result sent to driver
18:46:29.140 INFO TaskSetManager - Finished task 0.0 in stage 104.0 (TID 154) in 49 ms on localhost (executor driver) (1/1)
18:46:29.140 INFO TaskSchedulerImpl - Removed TaskSet 104.0, whose tasks have all completed, from pool
18:46:29.140 INFO DAGScheduler - ShuffleMapStage 104 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.069 s
18:46:29.140 INFO DAGScheduler - looking for newly runnable stages
18:46:29.140 INFO DAGScheduler - running: HashSet()
18:46:29.140 INFO DAGScheduler - waiting: HashSet(ResultStage 105)
18:46:29.140 INFO DAGScheduler - failed: HashSet()
18:46:29.140 INFO DAGScheduler - Submitting ResultStage 105 (MapPartitionsRDD[470] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:29.151 INFO MemoryStore - Block broadcast_191 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
18:46:29.152 INFO MemoryStore - Block broadcast_191_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1918.5 MiB)
18:46:29.152 INFO BlockManagerInfo - Added broadcast_191_piece0 in memory on localhost:45727 (size: 56.3 KiB, free: 1919.7 MiB)
18:46:29.152 INFO SparkContext - Created broadcast 191 from broadcast at DAGScheduler.scala:1580
18:46:29.153 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 105 (MapPartitionsRDD[470] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.153 INFO TaskSchedulerImpl - Adding task set 105.0 with 2 tasks resource profile 0
18:46:29.153 INFO TaskSetManager - Starting task 0.0 in stage 105.0 (TID 155) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:29.153 INFO TaskSetManager - Starting task 1.0 in stage 105.0 (TID 156) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:29.154 INFO Executor - Running task 0.0 in stage 105.0 (TID 155)
18:46:29.154 INFO Executor - Running task 1.0 in stage 105.0 (TID 156)
18:46:29.158 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.158 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.158 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.158 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.158 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.158 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.160 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.160 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.160 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.160 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.160 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.160 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.171 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.171 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.172 INFO ShuffleBlockFetcherIterator - Getting 1 (160.4 KiB) non-empty blocks including 1 (160.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.172 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.177 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846297014791534066831759_0470_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest315645131052240178344.bam/_temporary/0/task_202505191846297014791534066831759_0470_r_000001
18:46:29.177 INFO SparkHadoopMapRedUtil - attempt_202505191846297014791534066831759_0470_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:29.179 INFO Executor - Finished task 1.0 in stage 105.0 (TID 156). 1729 bytes result sent to driver
18:46:29.179 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846297014791534066831759_0470_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest315645131052240178344.bam/_temporary/0/task_202505191846297014791534066831759_0470_r_000000
18:46:29.179 INFO TaskSetManager - Finished task 1.0 in stage 105.0 (TID 156) in 26 ms on localhost (executor driver) (1/2)
18:46:29.179 INFO SparkHadoopMapRedUtil - attempt_202505191846297014791534066831759_0470_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:29.180 INFO Executor - Finished task 0.0 in stage 105.0 (TID 155). 1729 bytes result sent to driver
18:46:29.180 INFO TaskSetManager - Finished task 0.0 in stage 105.0 (TID 155) in 27 ms on localhost (executor driver) (2/2)
18:46:29.180 INFO TaskSchedulerImpl - Removed TaskSet 105.0, whose tasks have all completed, from pool
18:46:29.180 INFO DAGScheduler - ResultStage 105 (runJob at SparkHadoopWriter.scala:83) finished in 0.039 s
18:46:29.180 INFO DAGScheduler - Job 74 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.180 INFO TaskSchedulerImpl - Killing all running tasks in stage 105: Stage finished
18:46:29.180 INFO DAGScheduler - Job 74 finished: runJob at SparkHadoopWriter.scala:83, took 0.110488 s
18:46:29.181 INFO SparkHadoopWriter - Start to commit write Job job_202505191846297014791534066831759_0470.
18:46:29.187 INFO SparkHadoopWriter - Write Job job_202505191846297014791534066831759_0470 committed. Elapsed time: 5 ms.
18:46:29.189 INFO MemoryStore - Block broadcast_192 stored as values in memory (estimated size 297.9 KiB, free 1918.2 MiB)
18:46:29.195 INFO MemoryStore - Block broadcast_192_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
18:46:29.196 INFO BlockManagerInfo - Added broadcast_192_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:29.196 INFO SparkContext - Created broadcast 192 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.219 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:29.219 INFO DAGScheduler - Got job 75 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:29.219 INFO DAGScheduler - Final stage: ResultStage 107 (count at ReadsSparkSinkUnitTest.java:222)
18:46:29.219 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 106)
18:46:29.219 INFO DAGScheduler - Missing parents: List()
18:46:29.219 INFO DAGScheduler - Submitting ResultStage 107 (MapPartitionsRDD[461] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:29.220 INFO MemoryStore - Block broadcast_193 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
18:46:29.221 INFO MemoryStore - Block broadcast_193_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
18:46:29.221 INFO BlockManagerInfo - Added broadcast_193_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.6 MiB)
18:46:29.221 INFO SparkContext - Created broadcast 193 from broadcast at DAGScheduler.scala:1580
18:46:29.221 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 107 (MapPartitionsRDD[461] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.221 INFO TaskSchedulerImpl - Adding task set 107.0 with 2 tasks resource profile 0
18:46:29.222 INFO TaskSetManager - Starting task 0.0 in stage 107.0 (TID 157) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:29.222 INFO TaskSetManager - Starting task 1.0 in stage 107.0 (TID 158) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:29.222 INFO Executor - Running task 1.0 in stage 107.0 (TID 158)
18:46:29.222 INFO Executor - Running task 0.0 in stage 107.0 (TID 157)
18:46:29.224 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.224 INFO ShuffleBlockFetcherIterator - Getting 1 (160.4 KiB) non-empty blocks including 1 (160.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.224 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.224 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.228 INFO Executor - Finished task 0.0 in stage 107.0 (TID 157). 1591 bytes result sent to driver
18:46:29.228 INFO Executor - Finished task 1.0 in stage 107.0 (TID 158). 1591 bytes result sent to driver
18:46:29.228 INFO TaskSetManager - Finished task 1.0 in stage 107.0 (TID 158) in 6 ms on localhost (executor driver) (1/2)
18:46:29.228 INFO TaskSetManager - Finished task 0.0 in stage 107.0 (TID 157) in 6 ms on localhost (executor driver) (2/2)
18:46:29.228 INFO TaskSchedulerImpl - Removed TaskSet 107.0, whose tasks have all completed, from pool
18:46:29.229 INFO DAGScheduler - ResultStage 107 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.009 s
18:46:29.229 INFO DAGScheduler - Job 75 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.229 INFO TaskSchedulerImpl - Killing all running tasks in stage 107: Stage finished
18:46:29.229 INFO DAGScheduler - Job 75 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010188 s
18:46:29.241 INFO FileInputFormat - Total input files to process : 2
18:46:29.244 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:29.245 INFO DAGScheduler - Got job 76 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:29.245 INFO DAGScheduler - Final stage: ResultStage 108 (count at ReadsSparkSinkUnitTest.java:222)
18:46:29.245 INFO DAGScheduler - Parents of final stage: List()
18:46:29.245 INFO DAGScheduler - Missing parents: List()
18:46:29.245 INFO DAGScheduler - Submitting ResultStage 108 (MapPartitionsRDD[477] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:29.265 INFO MemoryStore - Block broadcast_194 stored as values in memory (estimated size 426.1 KiB, free 1917.7 MiB)
18:46:29.267 INFO MemoryStore - Block broadcast_194_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
18:46:29.267 INFO BlockManagerInfo - Added broadcast_194_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:29.267 INFO SparkContext - Created broadcast 194 from broadcast at DAGScheduler.scala:1580
18:46:29.267 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 108 (MapPartitionsRDD[477] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.267 INFO TaskSchedulerImpl - Adding task set 108.0 with 2 tasks resource profile 0
18:46:29.268 INFO TaskSetManager - Starting task 0.0 in stage 108.0 (TID 159) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
18:46:29.268 INFO TaskSetManager - Starting task 1.0 in stage 108.0 (TID 160) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
18:46:29.268 INFO Executor - Running task 0.0 in stage 108.0 (TID 159)
18:46:29.268 INFO Executor - Running task 1.0 in stage 108.0 (TID 160)
18:46:29.299 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest315645131052240178344.bam/part-r-00001.bam:0+123314
18:46:29.303 INFO Executor - Finished task 0.0 in stage 108.0 (TID 159). 989 bytes result sent to driver
18:46:29.303 INFO TaskSetManager - Finished task 0.0 in stage 108.0 (TID 159) in 35 ms on localhost (executor driver) (1/2)
18:46:29.313 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest315645131052240178344.bam/part-r-00000.bam:0+122169
18:46:29.317 INFO Executor - Finished task 1.0 in stage 108.0 (TID 160). 989 bytes result sent to driver
18:46:29.317 INFO TaskSetManager - Finished task 1.0 in stage 108.0 (TID 160) in 49 ms on localhost (executor driver) (2/2)
18:46:29.317 INFO TaskSchedulerImpl - Removed TaskSet 108.0, whose tasks have all completed, from pool
18:46:29.317 INFO DAGScheduler - ResultStage 108 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.072 s
18:46:29.317 INFO DAGScheduler - Job 76 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.317 INFO TaskSchedulerImpl - Killing all running tasks in stage 108: Stage finished
18:46:29.317 INFO DAGScheduler - Job 76 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.072913 s
18:46:29.320 INFO MemoryStore - Block broadcast_195 stored as values in memory (estimated size 576.0 B, free 1917.6 MiB)
18:46:29.320 INFO MemoryStore - Block broadcast_195_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.6 MiB)
18:46:29.320 INFO BlockManagerInfo - Added broadcast_195_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.5 MiB)
18:46:29.321 INFO SparkContext - Created broadcast 195 from broadcast at CramSource.java:114
18:46:29.321 INFO MemoryStore - Block broadcast_196 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
18:46:29.327 INFO MemoryStore - Block broadcast_196_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
18:46:29.328 INFO BlockManagerInfo - Added broadcast_196_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:29.328 INFO SparkContext - Created broadcast 196 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.344 INFO MemoryStore - Block broadcast_197 stored as values in memory (estimated size 576.0 B, free 1917.3 MiB)
18:46:29.345 INFO MemoryStore - Block broadcast_197_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.3 MiB)
18:46:29.345 INFO BlockManagerInfo - Added broadcast_197_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.4 MiB)
18:46:29.345 INFO SparkContext - Created broadcast 197 from broadcast at CramSource.java:114
18:46:29.346 INFO MemoryStore - Block broadcast_198 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
18:46:29.352 INFO MemoryStore - Block broadcast_198_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
18:46:29.352 INFO BlockManagerInfo - Added broadcast_198_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:29.352 INFO SparkContext - Created broadcast 198 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.366 INFO MemoryStore - Block broadcast_199 stored as values in memory (estimated size 6.0 KiB, free 1916.9 MiB)
18:46:29.366 INFO MemoryStore - Block broadcast_199_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1916.9 MiB)
18:46:29.366 INFO BlockManagerInfo - Added broadcast_199_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.4 MiB)
18:46:29.366 INFO SparkContext - Created broadcast 199 from broadcast at ReadsSparkSink.java:133
18:46:29.367 INFO MemoryStore - Block broadcast_200 stored as values in memory (estimated size 6.2 KiB, free 1916.9 MiB)
18:46:29.367 INFO MemoryStore - Block broadcast_200_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1916.9 MiB)
18:46:29.368 INFO BlockManagerInfo - Added broadcast_200_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.4 MiB)
18:46:29.368 INFO SparkContext - Created broadcast 200 from broadcast at AnySamSinkMultiple.java:80
18:46:29.369 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.370 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.370 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.381 INFO FileInputFormat - Total input files to process : 1
18:46:29.387 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:29.387 INFO DAGScheduler - Registering RDD 484 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 23
18:46:29.387 INFO DAGScheduler - Got job 77 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:29.387 INFO DAGScheduler - Final stage: ResultStage 110 (runJob at SparkHadoopWriter.scala:83)
18:46:29.387 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 109)
18:46:29.388 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 109)
18:46:29.388 INFO DAGScheduler - Submitting ShuffleMapStage 109 (MapPartitionsRDD[484] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:29.399 INFO MemoryStore - Block broadcast_201 stored as values in memory (estimated size 288.4 KiB, free 1916.6 MiB)
18:46:29.406 INFO BlockManagerInfo - Removed broadcast_194_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.5 MiB)
18:46:29.406 INFO MemoryStore - Block broadcast_201_piece0 stored as bytes in memory (estimated size 104.7 KiB, free 1917.1 MiB)
18:46:29.406 INFO BlockManagerInfo - Added broadcast_201_piece0 in memory on localhost:45727 (size: 104.7 KiB, free: 1919.4 MiB)
18:46:29.407 INFO SparkContext - Created broadcast 201 from broadcast at DAGScheduler.scala:1580
18:46:29.407 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 109 (MapPartitionsRDD[484] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:29.407 INFO TaskSchedulerImpl - Adding task set 109.0 with 1 tasks resource profile 0
18:46:29.407 INFO BlockManagerInfo - Removed broadcast_191_piece0 on localhost:45727 in memory (size: 56.3 KiB, free: 1919.5 MiB)
18:46:29.407 INFO TaskSetManager - Starting task 0.0 in stage 109.0 (TID 161) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
18:46:29.408 INFO Executor - Running task 0.0 in stage 109.0 (TID 161)
18:46:29.408 INFO BlockManagerInfo - Removed broadcast_186_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:29.408 INFO BlockManagerInfo - Removed broadcast_189_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.5 MiB)
18:46:29.409 INFO BlockManagerInfo - Removed broadcast_192_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:29.410 INFO BlockManagerInfo - Removed broadcast_198_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:29.411 INFO BlockManagerInfo - Removed broadcast_187_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:29.412 INFO BlockManagerInfo - Removed broadcast_193_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.7 MiB)
18:46:29.412 INFO BlockManagerInfo - Removed broadcast_188_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.7 MiB)
18:46:29.413 INFO BlockManagerInfo - Removed broadcast_197_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.7 MiB)
18:46:29.414 INFO BlockManagerInfo - Removed broadcast_190_piece0 on localhost:45727 in memory (size: 154.6 KiB, free: 1919.8 MiB)
18:46:29.431 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:29.447 INFO Executor - Finished task 0.0 in stage 109.0 (TID 161). 1149 bytes result sent to driver
18:46:29.448 INFO TaskSetManager - Finished task 0.0 in stage 109.0 (TID 161) in 41 ms on localhost (executor driver) (1/1)
18:46:29.448 INFO TaskSchedulerImpl - Removed TaskSet 109.0, whose tasks have all completed, from pool
18:46:29.448 INFO DAGScheduler - ShuffleMapStage 109 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.060 s
18:46:29.448 INFO DAGScheduler - looking for newly runnable stages
18:46:29.448 INFO DAGScheduler - running: HashSet()
18:46:29.448 INFO DAGScheduler - waiting: HashSet(ResultStage 110)
18:46:29.448 INFO DAGScheduler - failed: HashSet()
18:46:29.448 INFO DAGScheduler - Submitting ResultStage 110 (MapPartitionsRDD[495] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:29.459 INFO MemoryStore - Block broadcast_202 stored as values in memory (estimated size 150.3 KiB, free 1919.1 MiB)
18:46:29.460 INFO MemoryStore - Block broadcast_202_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1919.1 MiB)
18:46:29.460 INFO BlockManagerInfo - Added broadcast_202_piece0 in memory on localhost:45727 (size: 56.3 KiB, free: 1919.8 MiB)
18:46:29.461 INFO SparkContext - Created broadcast 202 from broadcast at DAGScheduler.scala:1580
18:46:29.461 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 110 (MapPartitionsRDD[495] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.461 INFO TaskSchedulerImpl - Adding task set 110.0 with 2 tasks resource profile 0
18:46:29.461 INFO TaskSetManager - Starting task 0.0 in stage 110.0 (TID 162) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:29.462 INFO TaskSetManager - Starting task 1.0 in stage 110.0 (TID 163) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:29.462 INFO Executor - Running task 0.0 in stage 110.0 (TID 162)
18:46:29.462 INFO Executor - Running task 1.0 in stage 110.0 (TID 163)
18:46:29.466 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.466 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.466 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.466 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.466 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.466 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.468 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.468 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.468 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.468 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.468 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.468 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.477 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.478 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.479 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.479 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.483 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846298472394724695587405_0495_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest57442520451485578800.cram/_temporary/0/task_202505191846298472394724695587405_0495_r_000000
18:46:29.483 INFO SparkHadoopMapRedUtil - attempt_202505191846298472394724695587405_0495_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:29.483 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846298472394724695587405_0495_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest57442520451485578800.cram/_temporary/0/task_202505191846298472394724695587405_0495_r_000001
18:46:29.483 INFO SparkHadoopMapRedUtil - attempt_202505191846298472394724695587405_0495_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:29.483 INFO Executor - Finished task 0.0 in stage 110.0 (TID 162). 1729 bytes result sent to driver
18:46:29.484 INFO TaskSetManager - Finished task 0.0 in stage 110.0 (TID 162) in 23 ms on localhost (executor driver) (1/2)
18:46:29.484 INFO Executor - Finished task 1.0 in stage 110.0 (TID 163). 1729 bytes result sent to driver
18:46:29.484 INFO TaskSetManager - Finished task 1.0 in stage 110.0 (TID 163) in 23 ms on localhost (executor driver) (2/2)
18:46:29.484 INFO TaskSchedulerImpl - Removed TaskSet 110.0, whose tasks have all completed, from pool
18:46:29.484 INFO DAGScheduler - ResultStage 110 (runJob at SparkHadoopWriter.scala:83) finished in 0.035 s
18:46:29.485 INFO DAGScheduler - Job 77 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.485 INFO TaskSchedulerImpl - Killing all running tasks in stage 110: Stage finished
18:46:29.485 INFO DAGScheduler - Job 77 finished: runJob at SparkHadoopWriter.scala:83, took 0.097868 s
18:46:29.485 INFO SparkHadoopWriter - Start to commit write Job job_202505191846298472394724695587405_0495.
18:46:29.490 INFO SparkHadoopWriter - Write Job job_202505191846298472394724695587405_0495 committed. Elapsed time: 5 ms.
18:46:29.492 INFO MemoryStore - Block broadcast_203 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
18:46:29.498 INFO MemoryStore - Block broadcast_203_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
18:46:29.499 INFO BlockManagerInfo - Added broadcast_203_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:29.499 INFO SparkContext - Created broadcast 203 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.522 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:29.522 INFO DAGScheduler - Got job 78 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:29.522 INFO DAGScheduler - Final stage: ResultStage 112 (count at ReadsSparkSinkUnitTest.java:222)
18:46:29.522 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 111)
18:46:29.522 INFO DAGScheduler - Missing parents: List()
18:46:29.522 INFO DAGScheduler - Submitting ResultStage 112 (MapPartitionsRDD[487] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:29.523 INFO MemoryStore - Block broadcast_204 stored as values in memory (estimated size 6.3 KiB, free 1918.7 MiB)
18:46:29.524 INFO MemoryStore - Block broadcast_204_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.7 MiB)
18:46:29.524 INFO BlockManagerInfo - Added broadcast_204_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.7 MiB)
18:46:29.524 INFO SparkContext - Created broadcast 204 from broadcast at DAGScheduler.scala:1580
18:46:29.524 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 112 (MapPartitionsRDD[487] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.524 INFO TaskSchedulerImpl - Adding task set 112.0 with 2 tasks resource profile 0
18:46:29.525 INFO TaskSetManager - Starting task 0.0 in stage 112.0 (TID 164) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:29.525 INFO TaskSetManager - Starting task 1.0 in stage 112.0 (TID 165) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:29.525 INFO Executor - Running task 0.0 in stage 112.0 (TID 164)
18:46:29.525 INFO Executor - Running task 1.0 in stage 112.0 (TID 165)
18:46:29.527 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.527 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.527 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.527 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.530 INFO Executor - Finished task 0.0 in stage 112.0 (TID 164). 1591 bytes result sent to driver
18:46:29.530 INFO Executor - Finished task 1.0 in stage 112.0 (TID 165). 1591 bytes result sent to driver
18:46:29.531 INFO TaskSetManager - Finished task 0.0 in stage 112.0 (TID 164) in 6 ms on localhost (executor driver) (1/2)
18:46:29.531 INFO TaskSetManager - Finished task 1.0 in stage 112.0 (TID 165) in 6 ms on localhost (executor driver) (2/2)
18:46:29.531 INFO TaskSchedulerImpl - Removed TaskSet 112.0, whose tasks have all completed, from pool
18:46:29.531 INFO DAGScheduler - ResultStage 112 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.008 s
18:46:29.531 INFO DAGScheduler - Job 78 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.531 INFO TaskSchedulerImpl - Killing all running tasks in stage 112: Stage finished
18:46:29.531 INFO DAGScheduler - Job 78 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.009685 s
18:46:29.543 INFO FileInputFormat - Total input files to process : 2
18:46:29.546 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:29.547 INFO DAGScheduler - Got job 79 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:29.547 INFO DAGScheduler - Final stage: ResultStage 113 (count at ReadsSparkSinkUnitTest.java:222)
18:46:29.547 INFO DAGScheduler - Parents of final stage: List()
18:46:29.547 INFO DAGScheduler - Missing parents: List()
18:46:29.547 INFO DAGScheduler - Submitting ResultStage 113 (MapPartitionsRDD[502] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:29.564 INFO MemoryStore - Block broadcast_205 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
18:46:29.566 INFO MemoryStore - Block broadcast_205_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.1 MiB)
18:46:29.566 INFO BlockManagerInfo - Added broadcast_205_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.6 MiB)
18:46:29.566 INFO SparkContext - Created broadcast 205 from broadcast at DAGScheduler.scala:1580
18:46:29.566 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 113 (MapPartitionsRDD[502] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.566 INFO TaskSchedulerImpl - Adding task set 113.0 with 2 tasks resource profile 0
18:46:29.567 INFO TaskSetManager - Starting task 0.0 in stage 113.0 (TID 166) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
18:46:29.567 INFO TaskSetManager - Starting task 1.0 in stage 113.0 (TID 167) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
18:46:29.567 INFO Executor - Running task 0.0 in stage 113.0 (TID 166)
18:46:29.567 INFO Executor - Running task 1.0 in stage 113.0 (TID 167)
18:46:29.597 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest57442520451485578800.cram/part-r-00000.bam:0+31473
18:46:29.599 INFO Executor - Finished task 1.0 in stage 113.0 (TID 167). 989 bytes result sent to driver
18:46:29.600 INFO TaskSetManager - Finished task 1.0 in stage 113.0 (TID 167) in 33 ms on localhost (executor driver) (1/2)
18:46:29.609 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest57442520451485578800.cram/part-r-00001.bam:0+30825
18:46:29.612 INFO Executor - Finished task 0.0 in stage 113.0 (TID 166). 989 bytes result sent to driver
18:46:29.612 INFO TaskSetManager - Finished task 0.0 in stage 113.0 (TID 166) in 46 ms on localhost (executor driver) (2/2)
18:46:29.612 INFO TaskSchedulerImpl - Removed TaskSet 113.0, whose tasks have all completed, from pool
18:46:29.612 INFO DAGScheduler - ResultStage 113 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.065 s
18:46:29.613 INFO DAGScheduler - Job 79 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.613 INFO TaskSchedulerImpl - Killing all running tasks in stage 113: Stage finished
18:46:29.613 INFO DAGScheduler - Job 79 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.066156 s
18:46:29.616 INFO MemoryStore - Block broadcast_206 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
18:46:29.623 INFO MemoryStore - Block broadcast_206_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.8 MiB)
18:46:29.623 INFO BlockManagerInfo - Added broadcast_206_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:29.623 INFO SparkContext - Created broadcast 206 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.646 INFO MemoryStore - Block broadcast_207 stored as values in memory (estimated size 297.9 KiB, free 1917.5 MiB)
18:46:29.653 INFO MemoryStore - Block broadcast_207_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
18:46:29.653 INFO BlockManagerInfo - Added broadcast_207_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:29.653 INFO SparkContext - Created broadcast 207 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.673 INFO MemoryStore - Block broadcast_208 stored as values in memory (estimated size 160.7 KiB, free 1917.3 MiB)
18:46:29.674 INFO MemoryStore - Block broadcast_208_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
18:46:29.674 INFO BlockManagerInfo - Added broadcast_208_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:29.674 INFO SparkContext - Created broadcast 208 from broadcast at ReadsSparkSink.java:133
18:46:29.676 INFO MemoryStore - Block broadcast_209 stored as values in memory (estimated size 163.2 KiB, free 1917.1 MiB)
18:46:29.676 INFO MemoryStore - Block broadcast_209_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.1 MiB)
18:46:29.676 INFO BlockManagerInfo - Added broadcast_209_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:29.677 INFO SparkContext - Created broadcast 209 from broadcast at AnySamSinkMultiple.java:80
18:46:29.678 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.678 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.678 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.690 INFO FileInputFormat - Total input files to process : 1
18:46:29.696 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:29.697 INFO DAGScheduler - Registering RDD 510 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 24
18:46:29.697 INFO DAGScheduler - Got job 80 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
18:46:29.697 INFO DAGScheduler - Final stage: ResultStage 115 (runJob at SparkHadoopWriter.scala:83)
18:46:29.697 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 114)
18:46:29.697 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 114)
18:46:29.697 INFO DAGScheduler - Submitting ShuffleMapStage 114 (MapPartitionsRDD[510] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:29.715 INFO MemoryStore - Block broadcast_210 stored as values in memory (estimated size 427.7 KiB, free 1916.7 MiB)
18:46:29.716 INFO MemoryStore - Block broadcast_210_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.6 MiB)
18:46:29.716 INFO BlockManagerInfo - Added broadcast_210_piece0 in memory on localhost:45727 (size: 154.6 KiB, free: 1919.3 MiB)
18:46:29.716 INFO SparkContext - Created broadcast 210 from broadcast at DAGScheduler.scala:1580
18:46:29.717 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 114 (MapPartitionsRDD[510] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
18:46:29.717 INFO TaskSchedulerImpl - Adding task set 114.0 with 1 tasks resource profile 0
18:46:29.717 INFO TaskSetManager - Starting task 0.0 in stage 114.0 (TID 168) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:29.718 INFO Executor - Running task 0.0 in stage 114.0 (TID 168)
18:46:29.747 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:29.763 INFO Executor - Finished task 0.0 in stage 114.0 (TID 168). 1149 bytes result sent to driver
18:46:29.764 INFO TaskSetManager - Finished task 0.0 in stage 114.0 (TID 168) in 47 ms on localhost (executor driver) (1/1)
18:46:29.764 INFO TaskSchedulerImpl - Removed TaskSet 114.0, whose tasks have all completed, from pool
18:46:29.764 INFO DAGScheduler - ShuffleMapStage 114 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.066 s
18:46:29.764 INFO DAGScheduler - looking for newly runnable stages
18:46:29.764 INFO DAGScheduler - running: HashSet()
18:46:29.764 INFO DAGScheduler - waiting: HashSet(ResultStage 115)
18:46:29.764 INFO DAGScheduler - failed: HashSet()
18:46:29.764 INFO DAGScheduler - Submitting ResultStage 115 (MapPartitionsRDD[522] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
18:46:29.775 INFO MemoryStore - Block broadcast_211 stored as values in memory (estimated size 150.2 KiB, free 1916.4 MiB)
18:46:29.776 INFO MemoryStore - Block broadcast_211_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1916.4 MiB)
18:46:29.776 INFO BlockManagerInfo - Added broadcast_211_piece0 in memory on localhost:45727 (size: 56.2 KiB, free: 1919.3 MiB)
18:46:29.776 INFO SparkContext - Created broadcast 211 from broadcast at DAGScheduler.scala:1580
18:46:29.777 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 115 (MapPartitionsRDD[522] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.777 INFO TaskSchedulerImpl - Adding task set 115.0 with 2 tasks resource profile 0
18:46:29.777 INFO TaskSetManager - Starting task 0.0 in stage 115.0 (TID 169) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:29.777 INFO TaskSetManager - Starting task 1.0 in stage 115.0 (TID 170) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:29.778 INFO Executor - Running task 0.0 in stage 115.0 (TID 169)
18:46:29.778 INFO Executor - Running task 1.0 in stage 115.0 (TID 170)
18:46:29.784 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.784 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.784 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.784 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.784 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.784 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.784 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.784 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:29.784 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.784 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:29.784 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.784 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:29.795 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.795 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.799 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.799 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.801 INFO FileOutputCommitter - Saved output of task 'attempt_20250519184629159956174231836740_0522_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest62479126793021640266.sam/_temporary/0/task_20250519184629159956174231836740_0522_r_000000
18:46:29.801 INFO SparkHadoopMapRedUtil - attempt_20250519184629159956174231836740_0522_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:29.802 INFO Executor - Finished task 0.0 in stage 115.0 (TID 169). 1729 bytes result sent to driver
18:46:29.802 INFO TaskSetManager - Finished task 0.0 in stage 115.0 (TID 169) in 25 ms on localhost (executor driver) (1/2)
18:46:29.807 INFO FileOutputCommitter - Saved output of task 'attempt_20250519184629159956174231836740_0522_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest62479126793021640266.sam/_temporary/0/task_20250519184629159956174231836740_0522_r_000001
18:46:29.807 INFO SparkHadoopMapRedUtil - attempt_20250519184629159956174231836740_0522_r_000001_0: Committed. Elapsed time: 0 ms.
18:46:29.807 INFO Executor - Finished task 1.0 in stage 115.0 (TID 170). 1729 bytes result sent to driver
18:46:29.808 INFO TaskSetManager - Finished task 1.0 in stage 115.0 (TID 170) in 31 ms on localhost (executor driver) (2/2)
18:46:29.808 INFO DAGScheduler - ResultStage 115 (runJob at SparkHadoopWriter.scala:83) finished in 0.043 s
18:46:29.808 INFO DAGScheduler - Job 80 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.808 INFO TaskSchedulerImpl - Removed TaskSet 115.0, whose tasks have all completed, from pool
18:46:29.808 INFO TaskSchedulerImpl - Killing all running tasks in stage 115: Stage finished
18:46:29.808 INFO DAGScheduler - Job 80 finished: runJob at SparkHadoopWriter.scala:83, took 0.111821 s
18:46:29.809 INFO SparkHadoopWriter - Start to commit write Job job_20250519184629159956174231836740_0522.
18:46:29.815 INFO SparkHadoopWriter - Write Job job_20250519184629159956174231836740_0522 committed. Elapsed time: 6 ms.
18:46:29.819 INFO MemoryStore - Block broadcast_212 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
18:46:29.829 INFO MemoryStore - Block broadcast_212_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
18:46:29.829 INFO BlockManagerInfo - Added broadcast_212_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:29.829 INFO SparkContext - Created broadcast 212 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.866 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:29.866 INFO DAGScheduler - Got job 81 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:29.866 INFO DAGScheduler - Final stage: ResultStage 117 (count at ReadsSparkSinkUnitTest.java:222)
18:46:29.866 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 116)
18:46:29.866 INFO DAGScheduler - Missing parents: List()
18:46:29.867 INFO DAGScheduler - Submitting ResultStage 117 (MapPartitionsRDD[513] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
18:46:29.867 INFO MemoryStore - Block broadcast_213 stored as values in memory (estimated size 6.3 KiB, free 1916.0 MiB)
18:46:29.868 INFO MemoryStore - Block broadcast_213_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.0 MiB)
18:46:29.868 INFO BlockManagerInfo - Added broadcast_213_piece0 in memory on localhost:45727 (size: 3.4 KiB, free: 1919.2 MiB)
18:46:29.868 INFO SparkContext - Created broadcast 213 from broadcast at DAGScheduler.scala:1580
18:46:29.868 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 117 (MapPartitionsRDD[513] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.868 INFO TaskSchedulerImpl - Adding task set 117.0 with 2 tasks resource profile 0
18:46:29.869 INFO TaskSetManager - Starting task 0.0 in stage 117.0 (TID 171) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
18:46:29.869 INFO TaskSetManager - Starting task 1.0 in stage 117.0 (TID 172) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
18:46:29.869 INFO Executor - Running task 0.0 in stage 117.0 (TID 171)
18:46:29.869 INFO Executor - Running task 1.0 in stage 117.0 (TID 172)
18:46:29.871 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.871 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.871 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:29.871 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:29.882 INFO Executor - Finished task 0.0 in stage 117.0 (TID 171). 1677 bytes result sent to driver
18:46:29.882 INFO Executor - Finished task 1.0 in stage 117.0 (TID 172). 1634 bytes result sent to driver
18:46:29.883 INFO TaskSetManager - Finished task 0.0 in stage 117.0 (TID 171) in 14 ms on localhost (executor driver) (1/2)
18:46:29.883 INFO TaskSetManager - Finished task 1.0 in stage 117.0 (TID 172) in 14 ms on localhost (executor driver) (2/2)
18:46:29.883 INFO TaskSchedulerImpl - Removed TaskSet 117.0, whose tasks have all completed, from pool
18:46:29.883 INFO DAGScheduler - ResultStage 117 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.016 s
18:46:29.883 INFO DAGScheduler - Job 81 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.883 INFO TaskSchedulerImpl - Killing all running tasks in stage 117: Stage finished
18:46:29.883 INFO BlockManagerInfo - Removed broadcast_207_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:29.884 INFO DAGScheduler - Job 81 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.017704 s
18:46:29.884 INFO BlockManagerInfo - Removed broadcast_209_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:29.885 INFO BlockManagerInfo - Removed broadcast_208_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:29.886 INFO BlockManagerInfo - Removed broadcast_210_piece0 on localhost:45727 in memory (size: 154.6 KiB, free: 1919.4 MiB)
18:46:29.886 INFO BlockManagerInfo - Removed broadcast_203_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:29.887 INFO BlockManagerInfo - Removed broadcast_199_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.5 MiB)
18:46:29.887 INFO BlockManagerInfo - Removed broadcast_196_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:29.888 INFO BlockManagerInfo - Removed broadcast_195_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.5 MiB)
18:46:29.889 INFO BlockManagerInfo - Removed broadcast_201_piece0 on localhost:45727 in memory (size: 104.7 KiB, free: 1919.6 MiB)
18:46:29.889 INFO BlockManagerInfo - Removed broadcast_204_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.6 MiB)
18:46:29.890 INFO BlockManagerInfo - Removed broadcast_211_piece0 on localhost:45727 in memory (size: 56.2 KiB, free: 1919.7 MiB)
18:46:29.891 INFO BlockManagerInfo - Removed broadcast_200_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.7 MiB)
18:46:29.892 INFO BlockManagerInfo - Removed broadcast_202_piece0 on localhost:45727 in memory (size: 56.3 KiB, free: 1919.7 MiB)
18:46:29.892 INFO BlockManagerInfo - Removed broadcast_205_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.9 MiB)
18:46:29.899 INFO FileInputFormat - Total input files to process : 2
18:46:29.902 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
18:46:29.902 INFO DAGScheduler - Got job 82 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
18:46:29.902 INFO DAGScheduler - Final stage: ResultStage 118 (count at ReadsSparkSinkUnitTest.java:222)
18:46:29.902 INFO DAGScheduler - Parents of final stage: List()
18:46:29.902 INFO DAGScheduler - Missing parents: List()
18:46:29.903 INFO DAGScheduler - Submitting ResultStage 118 (MapPartitionsRDD[529] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:29.920 INFO MemoryStore - Block broadcast_214 stored as values in memory (estimated size 426.1 KiB, free 1918.9 MiB)
18:46:29.922 INFO MemoryStore - Block broadcast_214_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.7 MiB)
18:46:29.922 INFO BlockManagerInfo - Added broadcast_214_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.7 MiB)
18:46:29.922 INFO SparkContext - Created broadcast 214 from broadcast at DAGScheduler.scala:1580
18:46:29.922 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 118 (MapPartitionsRDD[529] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
18:46:29.922 INFO TaskSchedulerImpl - Adding task set 118.0 with 2 tasks resource profile 0
18:46:29.923 INFO TaskSetManager - Starting task 0.0 in stage 118.0 (TID 173) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
18:46:29.923 INFO TaskSetManager - Starting task 1.0 in stage 118.0 (TID 174) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
18:46:29.923 INFO Executor - Running task 0.0 in stage 118.0 (TID 173)
18:46:29.923 INFO Executor - Running task 1.0 in stage 118.0 (TID 174)
18:46:29.953 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest62479126793021640266.sam/part-r-00000.bam:0+132492
18:46:29.953 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest62479126793021640266.sam/part-r-00001.bam:0+129330
18:46:29.962 INFO Executor - Finished task 1.0 in stage 118.0 (TID 174). 989 bytes result sent to driver
18:46:29.962 INFO Executor - Finished task 0.0 in stage 118.0 (TID 173). 989 bytes result sent to driver
18:46:29.963 INFO TaskSetManager - Finished task 0.0 in stage 118.0 (TID 173) in 40 ms on localhost (executor driver) (1/2)
18:46:29.963 INFO TaskSetManager - Finished task 1.0 in stage 118.0 (TID 174) in 40 ms on localhost (executor driver) (2/2)
18:46:29.963 INFO TaskSchedulerImpl - Removed TaskSet 118.0, whose tasks have all completed, from pool
18:46:29.963 INFO DAGScheduler - ResultStage 118 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.060 s
18:46:29.964 INFO DAGScheduler - Job 82 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:29.964 INFO TaskSchedulerImpl - Killing all running tasks in stage 118: Stage finished
18:46:29.964 INFO DAGScheduler - Job 82 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.061776 s
18:46:29.968 INFO MemoryStore - Block broadcast_215 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
18:46:29.974 INFO MemoryStore - Block broadcast_215_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
18:46:29.974 INFO BlockManagerInfo - Added broadcast_215_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:29.974 INFO SparkContext - Created broadcast 215 from newAPIHadoopFile at PathSplitSource.java:96
18:46:29.997 INFO MemoryStore - Block broadcast_216 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:30.003 INFO MemoryStore - Block broadcast_216_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
18:46:30.003 INFO BlockManagerInfo - Added broadcast_216_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:30.003 INFO SparkContext - Created broadcast 216 from newAPIHadoopFile at PathSplitSource.java:96
18:46:30.023 INFO FileInputFormat - Total input files to process : 1
18:46:30.026 INFO MemoryStore - Block broadcast_217 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
18:46:30.027 INFO MemoryStore - Block broadcast_217_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
18:46:30.027 INFO BlockManagerInfo - Added broadcast_217_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:30.027 INFO SparkContext - Created broadcast 217 from broadcast at ReadsSparkSink.java:133
18:46:30.028 INFO MemoryStore - Block broadcast_218 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
18:46:30.029 INFO MemoryStore - Block broadcast_218_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
18:46:30.029 INFO BlockManagerInfo - Added broadcast_218_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:30.029 INFO SparkContext - Created broadcast 218 from broadcast at BamSink.java:76
18:46:30.031 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.031 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.031 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:30.048 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:30.049 INFO DAGScheduler - Registering RDD 543 (mapToPair at SparkUtils.java:161) as input to shuffle 25
18:46:30.049 INFO DAGScheduler - Got job 83 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:30.049 INFO DAGScheduler - Final stage: ResultStage 120 (runJob at SparkHadoopWriter.scala:83)
18:46:30.049 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 119)
18:46:30.049 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 119)
18:46:30.050 INFO DAGScheduler - Submitting ShuffleMapStage 119 (MapPartitionsRDD[543] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:30.074 INFO MemoryStore - Block broadcast_219 stored as values in memory (estimated size 520.4 KiB, free 1917.2 MiB)
18:46:30.075 INFO MemoryStore - Block broadcast_219_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.1 MiB)
18:46:30.075 INFO BlockManagerInfo - Added broadcast_219_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.5 MiB)
18:46:30.075 INFO SparkContext - Created broadcast 219 from broadcast at DAGScheduler.scala:1580
18:46:30.076 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 119 (MapPartitionsRDD[543] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:30.076 INFO TaskSchedulerImpl - Adding task set 119.0 with 1 tasks resource profile 0
18:46:30.076 INFO TaskSetManager - Starting task 0.0 in stage 119.0 (TID 175) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:30.077 INFO Executor - Running task 0.0 in stage 119.0 (TID 175)
18:46:30.110 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:30.128 INFO Executor - Finished task 0.0 in stage 119.0 (TID 175). 1148 bytes result sent to driver
18:46:30.128 INFO TaskSetManager - Finished task 0.0 in stage 119.0 (TID 175) in 52 ms on localhost (executor driver) (1/1)
18:46:30.128 INFO TaskSchedulerImpl - Removed TaskSet 119.0, whose tasks have all completed, from pool
18:46:30.129 INFO DAGScheduler - ShuffleMapStage 119 (mapToPair at SparkUtils.java:161) finished in 0.078 s
18:46:30.129 INFO DAGScheduler - looking for newly runnable stages
18:46:30.129 INFO DAGScheduler - running: HashSet()
18:46:30.129 INFO DAGScheduler - waiting: HashSet(ResultStage 120)
18:46:30.129 INFO DAGScheduler - failed: HashSet()
18:46:30.129 INFO DAGScheduler - Submitting ResultStage 120 (MapPartitionsRDD[548] at mapToPair at BamSink.java:91), which has no missing parents
18:46:30.137 INFO MemoryStore - Block broadcast_220 stored as values in memory (estimated size 241.4 KiB, free 1916.8 MiB)
18:46:30.138 INFO MemoryStore - Block broadcast_220_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.8 MiB)
18:46:30.138 INFO BlockManagerInfo - Added broadcast_220_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.4 MiB)
18:46:30.138 INFO SparkContext - Created broadcast 220 from broadcast at DAGScheduler.scala:1580
18:46:30.138 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 120 (MapPartitionsRDD[548] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:30.139 INFO TaskSchedulerImpl - Adding task set 120.0 with 1 tasks resource profile 0
18:46:30.139 INFO TaskSetManager - Starting task 0.0 in stage 120.0 (TID 176) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:30.140 INFO Executor - Running task 0.0 in stage 120.0 (TID 176)
18:46:30.144 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:30.144 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:30.158 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.158 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.158 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:30.158 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.158 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.158 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:30.184 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846305578567175375312661_0548_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest12164828251672388236.bam.parts/_temporary/0/task_202505191846305578567175375312661_0548_r_000000
18:46:30.184 INFO SparkHadoopMapRedUtil - attempt_202505191846305578567175375312661_0548_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:30.184 INFO Executor - Finished task 0.0 in stage 120.0 (TID 176). 1858 bytes result sent to driver
18:46:30.185 INFO TaskSetManager - Finished task 0.0 in stage 120.0 (TID 176) in 46 ms on localhost (executor driver) (1/1)
18:46:30.185 INFO TaskSchedulerImpl - Removed TaskSet 120.0, whose tasks have all completed, from pool
18:46:30.185 INFO DAGScheduler - ResultStage 120 (runJob at SparkHadoopWriter.scala:83) finished in 0.056 s
18:46:30.185 INFO DAGScheduler - Job 83 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:30.185 INFO TaskSchedulerImpl - Killing all running tasks in stage 120: Stage finished
18:46:30.185 INFO DAGScheduler - Job 83 finished: runJob at SparkHadoopWriter.scala:83, took 0.136525 s
18:46:30.185 INFO SparkHadoopWriter - Start to commit write Job job_202505191846305578567175375312661_0548.
18:46:30.191 INFO SparkHadoopWriter - Write Job job_202505191846305578567175375312661_0548 committed. Elapsed time: 5 ms.
18:46:30.203 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest12164828251672388236.bam
18:46:30.207 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest12164828251672388236.bam done
18:46:30.208 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest12164828251672388236.bam.parts/ to /tmp/ReadsSparkSinkUnitTest12164828251672388236.bam.sbi
18:46:30.212 INFO IndexFileMerger - Done merging .sbi files
18:46:30.212 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest12164828251672388236.bam.parts/ to /tmp/ReadsSparkSinkUnitTest12164828251672388236.bam.bai
18:46:30.218 INFO IndexFileMerger - Done merging .bai files
18:46:30.220 INFO MemoryStore - Block broadcast_221 stored as values in memory (estimated size 320.0 B, free 1916.8 MiB)
18:46:30.221 INFO MemoryStore - Block broadcast_221_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.8 MiB)
18:46:30.221 INFO BlockManagerInfo - Added broadcast_221_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.4 MiB)
18:46:30.221 INFO SparkContext - Created broadcast 221 from broadcast at BamSource.java:104
18:46:30.222 INFO MemoryStore - Block broadcast_222 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
18:46:30.232 INFO MemoryStore - Block broadcast_222_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:30.232 INFO BlockManagerInfo - Added broadcast_222_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:30.233 INFO SparkContext - Created broadcast 222 from newAPIHadoopFile at PathSplitSource.java:96
18:46:30.247 INFO FileInputFormat - Total input files to process : 1
18:46:30.261 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:30.261 INFO DAGScheduler - Got job 84 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:30.261 INFO DAGScheduler - Final stage: ResultStage 121 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:30.261 INFO DAGScheduler - Parents of final stage: List()
18:46:30.261 INFO DAGScheduler - Missing parents: List()
18:46:30.262 INFO DAGScheduler - Submitting ResultStage 121 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:30.272 INFO MemoryStore - Block broadcast_223 stored as values in memory (estimated size 148.2 KiB, free 1916.3 MiB)
18:46:30.273 INFO MemoryStore - Block broadcast_223_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.2 MiB)
18:46:30.273 INFO BlockManagerInfo - Added broadcast_223_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.3 MiB)
18:46:30.273 INFO SparkContext - Created broadcast 223 from broadcast at DAGScheduler.scala:1580
18:46:30.273 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 121 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:30.273 INFO TaskSchedulerImpl - Adding task set 121.0 with 1 tasks resource profile 0
18:46:30.274 INFO TaskSetManager - Starting task 0.0 in stage 121.0 (TID 177) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:30.274 INFO Executor - Running task 0.0 in stage 121.0 (TID 177)
18:46:30.286 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest12164828251672388236.bam:0+237038
18:46:30.292 INFO Executor - Finished task 0.0 in stage 121.0 (TID 177). 651526 bytes result sent to driver
18:46:30.294 INFO TaskSetManager - Finished task 0.0 in stage 121.0 (TID 177) in 20 ms on localhost (executor driver) (1/1)
18:46:30.294 INFO TaskSchedulerImpl - Removed TaskSet 121.0, whose tasks have all completed, from pool
18:46:30.294 INFO DAGScheduler - ResultStage 121 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.032 s
18:46:30.294 INFO DAGScheduler - Job 84 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:30.294 INFO TaskSchedulerImpl - Killing all running tasks in stage 121: Stage finished
18:46:30.294 INFO DAGScheduler - Job 84 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.033221 s
18:46:30.308 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:30.308 INFO DAGScheduler - Got job 85 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:30.308 INFO DAGScheduler - Final stage: ResultStage 122 (count at ReadsSparkSinkUnitTest.java:185)
18:46:30.308 INFO DAGScheduler - Parents of final stage: List()
18:46:30.308 INFO DAGScheduler - Missing parents: List()
18:46:30.308 INFO DAGScheduler - Submitting ResultStage 122 (MapPartitionsRDD[536] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:30.325 INFO MemoryStore - Block broadcast_224 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
18:46:30.326 INFO MemoryStore - Block broadcast_224_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
18:46:30.327 INFO BlockManagerInfo - Added broadcast_224_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.2 MiB)
18:46:30.327 INFO SparkContext - Created broadcast 224 from broadcast at DAGScheduler.scala:1580
18:46:30.327 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 122 (MapPartitionsRDD[536] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:30.327 INFO TaskSchedulerImpl - Adding task set 122.0 with 1 tasks resource profile 0
18:46:30.327 INFO TaskSetManager - Starting task 0.0 in stage 122.0 (TID 178) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:30.328 INFO Executor - Running task 0.0 in stage 122.0 (TID 178)
18:46:30.357 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:30.368 INFO Executor - Finished task 0.0 in stage 122.0 (TID 178). 989 bytes result sent to driver
18:46:30.368 INFO TaskSetManager - Finished task 0.0 in stage 122.0 (TID 178) in 41 ms on localhost (executor driver) (1/1)
18:46:30.369 INFO TaskSchedulerImpl - Removed TaskSet 122.0, whose tasks have all completed, from pool
18:46:30.369 INFO DAGScheduler - ResultStage 122 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.061 s
18:46:30.369 INFO DAGScheduler - Job 85 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:30.369 INFO TaskSchedulerImpl - Killing all running tasks in stage 122: Stage finished
18:46:30.369 INFO DAGScheduler - Job 85 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.061167 s
18:46:30.372 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:30.373 INFO DAGScheduler - Got job 86 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:30.373 INFO DAGScheduler - Final stage: ResultStage 123 (count at ReadsSparkSinkUnitTest.java:185)
18:46:30.373 INFO DAGScheduler - Parents of final stage: List()
18:46:30.373 INFO DAGScheduler - Missing parents: List()
18:46:30.373 INFO DAGScheduler - Submitting ResultStage 123 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:30.379 INFO MemoryStore - Block broadcast_225 stored as values in memory (estimated size 148.1 KiB, free 1915.5 MiB)
18:46:30.380 INFO MemoryStore - Block broadcast_225_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.5 MiB)
18:46:30.380 INFO BlockManagerInfo - Added broadcast_225_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.1 MiB)
18:46:30.380 INFO SparkContext - Created broadcast 225 from broadcast at DAGScheduler.scala:1580
18:46:30.380 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 123 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:30.380 INFO TaskSchedulerImpl - Adding task set 123.0 with 1 tasks resource profile 0
18:46:30.381 INFO TaskSetManager - Starting task 0.0 in stage 123.0 (TID 179) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:30.381 INFO Executor - Running task 0.0 in stage 123.0 (TID 179)
18:46:30.392 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest12164828251672388236.bam:0+237038
18:46:30.395 INFO Executor - Finished task 0.0 in stage 123.0 (TID 179). 989 bytes result sent to driver
18:46:30.396 INFO TaskSetManager - Finished task 0.0 in stage 123.0 (TID 179) in 15 ms on localhost (executor driver) (1/1)
18:46:30.396 INFO TaskSchedulerImpl - Removed TaskSet 123.0, whose tasks have all completed, from pool
18:46:30.396 INFO DAGScheduler - ResultStage 123 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.023 s
18:46:30.396 INFO DAGScheduler - Job 86 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:30.396 INFO TaskSchedulerImpl - Killing all running tasks in stage 123: Stage finished
18:46:30.396 INFO DAGScheduler - Job 86 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.023908 s
18:46:30.399 INFO MemoryStore - Block broadcast_226 stored as values in memory (estimated size 297.9 KiB, free 1915.2 MiB)
18:46:30.405 INFO MemoryStore - Block broadcast_226_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.1 MiB)
18:46:30.405 INFO BlockManagerInfo - Added broadcast_226_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.0 MiB)
18:46:30.405 INFO SparkContext - Created broadcast 226 from newAPIHadoopFile at PathSplitSource.java:96
18:46:30.426 INFO MemoryStore - Block broadcast_227 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
18:46:30.435 INFO BlockManagerInfo - Removed broadcast_214_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.2 MiB)
18:46:30.435 INFO BlockManagerInfo - Removed broadcast_222_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:30.436 INFO BlockManagerInfo - Removed broadcast_206_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:30.436 INFO BlockManagerInfo - Removed broadcast_224_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.4 MiB)
18:46:30.437 INFO BlockManagerInfo - Removed broadcast_219_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.6 MiB)
18:46:30.437 INFO BlockManagerInfo - Removed broadcast_218_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:30.438 INFO BlockManagerInfo - Removed broadcast_215_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:30.438 INFO BlockManagerInfo - Removed broadcast_221_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.7 MiB)
18:46:30.439 INFO BlockManagerInfo - Removed broadcast_225_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.7 MiB)
18:46:30.439 INFO BlockManagerInfo - Removed broadcast_213_piece0 on localhost:45727 in memory (size: 3.4 KiB, free: 1919.7 MiB)
18:46:30.440 INFO BlockManagerInfo - Removed broadcast_212_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:30.440 INFO BlockManagerInfo - Removed broadcast_217_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:30.440 INFO BlockManagerInfo - Removed broadcast_216_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:30.442 INFO MemoryStore - Block broadcast_227_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.8 MiB)
18:46:30.442 INFO BlockManagerInfo - Added broadcast_227_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.8 MiB)
18:46:30.442 INFO SparkContext - Created broadcast 227 from newAPIHadoopFile at PathSplitSource.java:96
18:46:30.442 INFO BlockManagerInfo - Removed broadcast_220_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.8 MiB)
18:46:30.443 INFO BlockManagerInfo - Removed broadcast_223_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.9 MiB)
18:46:30.463 INFO FileInputFormat - Total input files to process : 1
18:46:30.464 INFO MemoryStore - Block broadcast_228 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
18:46:30.465 INFO MemoryStore - Block broadcast_228_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
18:46:30.465 INFO BlockManagerInfo - Added broadcast_228_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.9 MiB)
18:46:30.465 INFO SparkContext - Created broadcast 228 from broadcast at ReadsSparkSink.java:133
18:46:30.467 INFO MemoryStore - Block broadcast_229 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
18:46:30.467 INFO MemoryStore - Block broadcast_229_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
18:46:30.468 INFO BlockManagerInfo - Added broadcast_229_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.9 MiB)
18:46:30.468 INFO SparkContext - Created broadcast 229 from broadcast at BamSink.java:76
18:46:30.469 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.470 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.470 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:30.486 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:30.487 INFO DAGScheduler - Registering RDD 568 (mapToPair at SparkUtils.java:161) as input to shuffle 26
18:46:30.487 INFO DAGScheduler - Got job 87 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:30.487 INFO DAGScheduler - Final stage: ResultStage 125 (runJob at SparkHadoopWriter.scala:83)
18:46:30.487 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 124)
18:46:30.487 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 124)
18:46:30.487 INFO DAGScheduler - Submitting ShuffleMapStage 124 (MapPartitionsRDD[568] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:30.505 INFO MemoryStore - Block broadcast_230 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
18:46:30.506 INFO MemoryStore - Block broadcast_230_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
18:46:30.506 INFO BlockManagerInfo - Added broadcast_230_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.7 MiB)
18:46:30.506 INFO SparkContext - Created broadcast 230 from broadcast at DAGScheduler.scala:1580
18:46:30.506 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 124 (MapPartitionsRDD[568] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:30.506 INFO TaskSchedulerImpl - Adding task set 124.0 with 1 tasks resource profile 0
18:46:30.507 INFO TaskSetManager - Starting task 0.0 in stage 124.0 (TID 180) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:30.507 INFO Executor - Running task 0.0 in stage 124.0 (TID 180)
18:46:30.537 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:30.552 INFO Executor - Finished task 0.0 in stage 124.0 (TID 180). 1148 bytes result sent to driver
18:46:30.552 INFO TaskSetManager - Finished task 0.0 in stage 124.0 (TID 180) in 45 ms on localhost (executor driver) (1/1)
18:46:30.552 INFO TaskSchedulerImpl - Removed TaskSet 124.0, whose tasks have all completed, from pool
18:46:30.553 INFO DAGScheduler - ShuffleMapStage 124 (mapToPair at SparkUtils.java:161) finished in 0.066 s
18:46:30.553 INFO DAGScheduler - looking for newly runnable stages
18:46:30.553 INFO DAGScheduler - running: HashSet()
18:46:30.553 INFO DAGScheduler - waiting: HashSet(ResultStage 125)
18:46:30.553 INFO DAGScheduler - failed: HashSet()
18:46:30.553 INFO DAGScheduler - Submitting ResultStage 125 (MapPartitionsRDD[573] at mapToPair at BamSink.java:91), which has no missing parents
18:46:30.559 INFO MemoryStore - Block broadcast_231 stored as values in memory (estimated size 241.4 KiB, free 1918.1 MiB)
18:46:30.560 INFO MemoryStore - Block broadcast_231_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.0 MiB)
18:46:30.560 INFO BlockManagerInfo - Added broadcast_231_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.7 MiB)
18:46:30.561 INFO SparkContext - Created broadcast 231 from broadcast at DAGScheduler.scala:1580
18:46:30.561 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 125 (MapPartitionsRDD[573] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:30.561 INFO TaskSchedulerImpl - Adding task set 125.0 with 1 tasks resource profile 0
18:46:30.561 INFO TaskSetManager - Starting task 0.0 in stage 125.0 (TID 181) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:30.562 INFO Executor - Running task 0.0 in stage 125.0 (TID 181)
18:46:30.566 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:30.566 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:30.577 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.577 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.577 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:30.577 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.577 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.577 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:30.601 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846307846010211794276174_0573_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest11385381728215302356.bam.parts/_temporary/0/task_202505191846307846010211794276174_0573_r_000000
18:46:30.602 INFO SparkHadoopMapRedUtil - attempt_202505191846307846010211794276174_0573_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:30.602 INFO Executor - Finished task 0.0 in stage 125.0 (TID 181). 1858 bytes result sent to driver
18:46:30.602 INFO TaskSetManager - Finished task 0.0 in stage 125.0 (TID 181) in 41 ms on localhost (executor driver) (1/1)
18:46:30.603 INFO TaskSchedulerImpl - Removed TaskSet 125.0, whose tasks have all completed, from pool
18:46:30.603 INFO DAGScheduler - ResultStage 125 (runJob at SparkHadoopWriter.scala:83) finished in 0.050 s
18:46:30.603 INFO DAGScheduler - Job 87 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:30.603 INFO TaskSchedulerImpl - Killing all running tasks in stage 125: Stage finished
18:46:30.604 INFO DAGScheduler - Job 87 finished: runJob at SparkHadoopWriter.scala:83, took 0.117308 s
18:46:30.604 INFO SparkHadoopWriter - Start to commit write Job job_202505191846307846010211794276174_0573.
18:46:30.609 INFO SparkHadoopWriter - Write Job job_202505191846307846010211794276174_0573 committed. Elapsed time: 5 ms.
18:46:30.623 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest11385381728215302356.bam
18:46:30.628 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest11385381728215302356.bam done
18:46:30.628 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest11385381728215302356.bam.parts/ to /tmp/ReadsSparkSinkUnitTest11385381728215302356.bam.sbi
18:46:30.635 INFO IndexFileMerger - Done merging .sbi files
18:46:30.635 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest11385381728215302356.bam.parts/ to /tmp/ReadsSparkSinkUnitTest11385381728215302356.bam.bai
18:46:30.640 INFO IndexFileMerger - Done merging .bai files
18:46:30.642 INFO MemoryStore - Block broadcast_232 stored as values in memory (estimated size 13.3 KiB, free 1918.0 MiB)
18:46:30.643 INFO MemoryStore - Block broadcast_232_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.0 MiB)
18:46:30.643 INFO BlockManagerInfo - Added broadcast_232_piece0 in memory on localhost:45727 (size: 8.3 KiB, free: 1919.6 MiB)
18:46:30.643 INFO SparkContext - Created broadcast 232 from broadcast at BamSource.java:104
18:46:30.644 INFO MemoryStore - Block broadcast_233 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
18:46:30.650 INFO MemoryStore - Block broadcast_233_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
18:46:30.650 INFO BlockManagerInfo - Added broadcast_233_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:30.650 INFO SparkContext - Created broadcast 233 from newAPIHadoopFile at PathSplitSource.java:96
18:46:30.659 INFO FileInputFormat - Total input files to process : 1
18:46:30.673 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:30.673 INFO DAGScheduler - Got job 88 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:30.673 INFO DAGScheduler - Final stage: ResultStage 126 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:30.673 INFO DAGScheduler - Parents of final stage: List()
18:46:30.673 INFO DAGScheduler - Missing parents: List()
18:46:30.673 INFO DAGScheduler - Submitting ResultStage 126 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:30.683 INFO MemoryStore - Block broadcast_234 stored as values in memory (estimated size 148.2 KiB, free 1917.5 MiB)
18:46:30.684 INFO MemoryStore - Block broadcast_234_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
18:46:30.684 INFO BlockManagerInfo - Added broadcast_234_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.5 MiB)
18:46:30.684 INFO SparkContext - Created broadcast 234 from broadcast at DAGScheduler.scala:1580
18:46:30.684 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 126 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:30.684 INFO TaskSchedulerImpl - Adding task set 126.0 with 1 tasks resource profile 0
18:46:30.685 INFO TaskSetManager - Starting task 0.0 in stage 126.0 (TID 182) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:30.685 INFO Executor - Running task 0.0 in stage 126.0 (TID 182)
18:46:30.696 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest11385381728215302356.bam:0+237038
18:46:30.701 INFO Executor - Finished task 0.0 in stage 126.0 (TID 182). 651483 bytes result sent to driver
18:46:30.703 INFO TaskSetManager - Finished task 0.0 in stage 126.0 (TID 182) in 18 ms on localhost (executor driver) (1/1)
18:46:30.703 INFO TaskSchedulerImpl - Removed TaskSet 126.0, whose tasks have all completed, from pool
18:46:30.704 INFO DAGScheduler - ResultStage 126 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.031 s
18:46:30.704 INFO DAGScheduler - Job 88 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:30.704 INFO TaskSchedulerImpl - Killing all running tasks in stage 126: Stage finished
18:46:30.704 INFO DAGScheduler - Job 88 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.031153 s
18:46:30.716 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:30.716 INFO DAGScheduler - Got job 89 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:30.716 INFO DAGScheduler - Final stage: ResultStage 127 (count at ReadsSparkSinkUnitTest.java:185)
18:46:30.716 INFO DAGScheduler - Parents of final stage: List()
18:46:30.716 INFO DAGScheduler - Missing parents: List()
18:46:30.716 INFO DAGScheduler - Submitting ResultStage 127 (MapPartitionsRDD[561] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:30.733 INFO MemoryStore - Block broadcast_235 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
18:46:30.734 INFO MemoryStore - Block broadcast_235_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
18:46:30.734 INFO BlockManagerInfo - Added broadcast_235_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.4 MiB)
18:46:30.734 INFO SparkContext - Created broadcast 235 from broadcast at DAGScheduler.scala:1580
18:46:30.735 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 127 (MapPartitionsRDD[561] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:30.735 INFO TaskSchedulerImpl - Adding task set 127.0 with 1 tasks resource profile 0
18:46:30.735 INFO TaskSetManager - Starting task 0.0 in stage 127.0 (TID 183) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:30.735 INFO Executor - Running task 0.0 in stage 127.0 (TID 183)
18:46:30.766 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:30.775 INFO Executor - Finished task 0.0 in stage 127.0 (TID 183). 989 bytes result sent to driver
18:46:30.775 INFO TaskSetManager - Finished task 0.0 in stage 127.0 (TID 183) in 40 ms on localhost (executor driver) (1/1)
18:46:30.776 INFO TaskSchedulerImpl - Removed TaskSet 127.0, whose tasks have all completed, from pool
18:46:30.776 INFO DAGScheduler - ResultStage 127 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
18:46:30.776 INFO DAGScheduler - Job 89 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:30.776 INFO TaskSchedulerImpl - Killing all running tasks in stage 127: Stage finished
18:46:30.776 INFO DAGScheduler - Job 89 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060339 s
18:46:30.779 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:30.779 INFO DAGScheduler - Got job 90 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:30.779 INFO DAGScheduler - Final stage: ResultStage 128 (count at ReadsSparkSinkUnitTest.java:185)
18:46:30.779 INFO DAGScheduler - Parents of final stage: List()
18:46:30.779 INFO DAGScheduler - Missing parents: List()
18:46:30.780 INFO DAGScheduler - Submitting ResultStage 128 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:30.786 INFO MemoryStore - Block broadcast_236 stored as values in memory (estimated size 148.1 KiB, free 1916.7 MiB)
18:46:30.786 INFO MemoryStore - Block broadcast_236_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1916.7 MiB)
18:46:30.787 INFO BlockManagerInfo - Added broadcast_236_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.3 MiB)
18:46:30.787 INFO SparkContext - Created broadcast 236 from broadcast at DAGScheduler.scala:1580
18:46:30.787 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 128 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:30.787 INFO TaskSchedulerImpl - Adding task set 128.0 with 1 tasks resource profile 0
18:46:30.787 INFO TaskSetManager - Starting task 0.0 in stage 128.0 (TID 184) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:30.788 INFO Executor - Running task 0.0 in stage 128.0 (TID 184)
18:46:30.799 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest11385381728215302356.bam:0+237038
18:46:30.803 INFO Executor - Finished task 0.0 in stage 128.0 (TID 184). 989 bytes result sent to driver
18:46:30.803 INFO TaskSetManager - Finished task 0.0 in stage 128.0 (TID 184) in 16 ms on localhost (executor driver) (1/1)
18:46:30.803 INFO TaskSchedulerImpl - Removed TaskSet 128.0, whose tasks have all completed, from pool
18:46:30.803 INFO DAGScheduler - ResultStage 128 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.023 s
18:46:30.803 INFO DAGScheduler - Job 90 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:30.803 INFO TaskSchedulerImpl - Killing all running tasks in stage 128: Stage finished
18:46:30.803 INFO DAGScheduler - Job 90 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.024162 s
18:46:30.806 INFO MemoryStore - Block broadcast_237 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
18:46:30.817 INFO MemoryStore - Block broadcast_237_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:30.817 INFO BlockManagerInfo - Added broadcast_237_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:30.817 INFO SparkContext - Created broadcast 237 from newAPIHadoopFile at PathSplitSource.java:96
18:46:30.845 INFO MemoryStore - Block broadcast_238 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
18:46:30.851 INFO MemoryStore - Block broadcast_238_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
18:46:30.852 INFO BlockManagerInfo - Added broadcast_238_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:30.852 INFO SparkContext - Created broadcast 238 from newAPIHadoopFile at PathSplitSource.java:96
18:46:30.871 INFO FileInputFormat - Total input files to process : 1
18:46:30.873 INFO MemoryStore - Block broadcast_239 stored as values in memory (estimated size 160.7 KiB, free 1915.9 MiB)
18:46:30.874 INFO MemoryStore - Block broadcast_239_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.8 MiB)
18:46:30.874 INFO BlockManagerInfo - Added broadcast_239_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.2 MiB)
18:46:30.874 INFO SparkContext - Created broadcast 239 from broadcast at ReadsSparkSink.java:133
18:46:30.874 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:30.875 INFO MemoryStore - Block broadcast_240 stored as values in memory (estimated size 163.2 KiB, free 1915.7 MiB)
18:46:30.876 INFO MemoryStore - Block broadcast_240_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.7 MiB)
18:46:30.876 INFO BlockManagerInfo - Added broadcast_240_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.2 MiB)
18:46:30.876 INFO SparkContext - Created broadcast 240 from broadcast at BamSink.java:76
18:46:30.878 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.878 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.878 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:30.894 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:30.894 INFO DAGScheduler - Registering RDD 593 (mapToPair at SparkUtils.java:161) as input to shuffle 27
18:46:30.895 INFO DAGScheduler - Got job 91 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:30.895 INFO DAGScheduler - Final stage: ResultStage 130 (runJob at SparkHadoopWriter.scala:83)
18:46:30.895 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 129)
18:46:30.895 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 129)
18:46:30.895 INFO DAGScheduler - Submitting ShuffleMapStage 129 (MapPartitionsRDD[593] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:30.915 INFO MemoryStore - Block broadcast_241 stored as values in memory (estimated size 520.4 KiB, free 1915.2 MiB)
18:46:30.922 INFO BlockManagerInfo - Removed broadcast_236_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.3 MiB)
18:46:30.923 INFO MemoryStore - Block broadcast_241_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.2 MiB)
18:46:30.923 INFO BlockManagerInfo - Removed broadcast_230_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.4 MiB)
18:46:30.923 INFO BlockManagerInfo - Added broadcast_241_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.3 MiB)
18:46:30.923 INFO SparkContext - Created broadcast 241 from broadcast at DAGScheduler.scala:1580
18:46:30.924 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 129 (MapPartitionsRDD[593] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:30.924 INFO TaskSchedulerImpl - Adding task set 129.0 with 1 tasks resource profile 0
18:46:30.924 INFO BlockManagerInfo - Removed broadcast_235_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.4 MiB)
18:46:30.925 INFO TaskSetManager - Starting task 0.0 in stage 129.0 (TID 185) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:30.925 INFO BlockManagerInfo - Removed broadcast_238_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:30.925 INFO Executor - Running task 0.0 in stage 129.0 (TID 185)
18:46:30.925 INFO BlockManagerInfo - Removed broadcast_227_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:30.926 INFO BlockManagerInfo - Removed broadcast_234_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.6 MiB)
18:46:30.927 INFO BlockManagerInfo - Removed broadcast_229_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:30.928 INFO BlockManagerInfo - Removed broadcast_231_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.7 MiB)
18:46:30.928 INFO BlockManagerInfo - Removed broadcast_233_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:30.929 INFO BlockManagerInfo - Removed broadcast_226_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:30.929 INFO BlockManagerInfo - Removed broadcast_228_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:30.930 INFO BlockManagerInfo - Removed broadcast_232_piece0 on localhost:45727 in memory (size: 8.3 KiB, free: 1919.8 MiB)
18:46:30.959 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:30.974 INFO Executor - Finished task 0.0 in stage 129.0 (TID 185). 1148 bytes result sent to driver
18:46:30.975 INFO TaskSetManager - Finished task 0.0 in stage 129.0 (TID 185) in 50 ms on localhost (executor driver) (1/1)
18:46:30.975 INFO TaskSchedulerImpl - Removed TaskSet 129.0, whose tasks have all completed, from pool
18:46:30.975 INFO DAGScheduler - ShuffleMapStage 129 (mapToPair at SparkUtils.java:161) finished in 0.080 s
18:46:30.975 INFO DAGScheduler - looking for newly runnable stages
18:46:30.975 INFO DAGScheduler - running: HashSet()
18:46:30.975 INFO DAGScheduler - waiting: HashSet(ResultStage 130)
18:46:30.975 INFO DAGScheduler - failed: HashSet()
18:46:30.975 INFO DAGScheduler - Submitting ResultStage 130 (MapPartitionsRDD[598] at mapToPair at BamSink.java:91), which has no missing parents
18:46:30.982 INFO MemoryStore - Block broadcast_242 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
18:46:30.983 INFO MemoryStore - Block broadcast_242_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
18:46:30.983 INFO BlockManagerInfo - Added broadcast_242_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.7 MiB)
18:46:30.983 INFO SparkContext - Created broadcast 242 from broadcast at DAGScheduler.scala:1580
18:46:30.983 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 130 (MapPartitionsRDD[598] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:30.983 INFO TaskSchedulerImpl - Adding task set 130.0 with 1 tasks resource profile 0
18:46:30.984 INFO TaskSetManager - Starting task 0.0 in stage 130.0 (TID 186) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:30.984 INFO Executor - Running task 0.0 in stage 130.0 (TID 186)
18:46:30.988 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:30.988 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:30.999 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.999 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.999 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:30.999 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:30.999 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:30.999 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:31.020 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846303612084647918536807_0598_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest19727986388591723413.bam.parts/_temporary/0/task_202505191846303612084647918536807_0598_r_000000
18:46:31.020 INFO SparkHadoopMapRedUtil - attempt_202505191846303612084647918536807_0598_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:31.021 INFO Executor - Finished task 0.0 in stage 130.0 (TID 186). 1858 bytes result sent to driver
18:46:31.021 INFO TaskSetManager - Finished task 0.0 in stage 130.0 (TID 186) in 38 ms on localhost (executor driver) (1/1)
18:46:31.021 INFO TaskSchedulerImpl - Removed TaskSet 130.0, whose tasks have all completed, from pool
18:46:31.022 INFO DAGScheduler - ResultStage 130 (runJob at SparkHadoopWriter.scala:83) finished in 0.047 s
18:46:31.022 INFO DAGScheduler - Job 91 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.022 INFO TaskSchedulerImpl - Killing all running tasks in stage 130: Stage finished
18:46:31.022 INFO DAGScheduler - Job 91 finished: runJob at SparkHadoopWriter.scala:83, took 0.127979 s
18:46:31.022 INFO SparkHadoopWriter - Start to commit write Job job_202505191846303612084647918536807_0598.
18:46:31.028 INFO SparkHadoopWriter - Write Job job_202505191846303612084647918536807_0598 committed. Elapsed time: 5 ms.
18:46:31.039 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest19727986388591723413.bam
18:46:31.043 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest19727986388591723413.bam done
18:46:31.043 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest19727986388591723413.bam.parts/ to /tmp/ReadsSparkSinkUnitTest19727986388591723413.bam.bai
18:46:31.048 INFO IndexFileMerger - Done merging .bai files
18:46:31.051 INFO MemoryStore - Block broadcast_243 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:31.057 INFO MemoryStore - Block broadcast_243_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:31.058 INFO BlockManagerInfo - Added broadcast_243_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:31.058 INFO SparkContext - Created broadcast 243 from newAPIHadoopFile at PathSplitSource.java:96
18:46:31.078 INFO FileInputFormat - Total input files to process : 1
18:46:31.113 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:31.113 INFO DAGScheduler - Got job 92 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:31.113 INFO DAGScheduler - Final stage: ResultStage 131 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:31.113 INFO DAGScheduler - Parents of final stage: List()
18:46:31.113 INFO DAGScheduler - Missing parents: List()
18:46:31.114 INFO DAGScheduler - Submitting ResultStage 131 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:31.132 INFO MemoryStore - Block broadcast_244 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
18:46:31.134 INFO MemoryStore - Block broadcast_244_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1917.4 MiB)
18:46:31.134 INFO BlockManagerInfo - Added broadcast_244_piece0 in memory on localhost:45727 (size: 153.7 KiB, free: 1919.5 MiB)
18:46:31.134 INFO SparkContext - Created broadcast 244 from broadcast at DAGScheduler.scala:1580
18:46:31.134 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 131 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:31.134 INFO TaskSchedulerImpl - Adding task set 131.0 with 1 tasks resource profile 0
18:46:31.135 INFO TaskSetManager - Starting task 0.0 in stage 131.0 (TID 187) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:31.135 INFO Executor - Running task 0.0 in stage 131.0 (TID 187)
18:46:31.165 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19727986388591723413.bam:0+237038
18:46:31.177 INFO Executor - Finished task 0.0 in stage 131.0 (TID 187). 651483 bytes result sent to driver
18:46:31.179 INFO TaskSetManager - Finished task 0.0 in stage 131.0 (TID 187) in 44 ms on localhost (executor driver) (1/1)
18:46:31.179 INFO TaskSchedulerImpl - Removed TaskSet 131.0, whose tasks have all completed, from pool
18:46:31.179 INFO DAGScheduler - ResultStage 131 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.065 s
18:46:31.179 INFO DAGScheduler - Job 92 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.179 INFO TaskSchedulerImpl - Killing all running tasks in stage 131: Stage finished
18:46:31.179 INFO DAGScheduler - Job 92 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.066294 s
18:46:31.189 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:31.189 INFO DAGScheduler - Got job 93 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:31.189 INFO DAGScheduler - Final stage: ResultStage 132 (count at ReadsSparkSinkUnitTest.java:185)
18:46:31.189 INFO DAGScheduler - Parents of final stage: List()
18:46:31.189 INFO DAGScheduler - Missing parents: List()
18:46:31.189 INFO DAGScheduler - Submitting ResultStage 132 (MapPartitionsRDD[586] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:31.208 INFO MemoryStore - Block broadcast_245 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
18:46:31.209 INFO MemoryStore - Block broadcast_245_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
18:46:31.210 INFO BlockManagerInfo - Added broadcast_245_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.4 MiB)
18:46:31.210 INFO SparkContext - Created broadcast 245 from broadcast at DAGScheduler.scala:1580
18:46:31.210 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 132 (MapPartitionsRDD[586] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:31.210 INFO TaskSchedulerImpl - Adding task set 132.0 with 1 tasks resource profile 0
18:46:31.210 INFO TaskSetManager - Starting task 0.0 in stage 132.0 (TID 188) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:31.211 INFO Executor - Running task 0.0 in stage 132.0 (TID 188)
18:46:31.239 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:31.248 INFO Executor - Finished task 0.0 in stage 132.0 (TID 188). 989 bytes result sent to driver
18:46:31.249 INFO TaskSetManager - Finished task 0.0 in stage 132.0 (TID 188) in 39 ms on localhost (executor driver) (1/1)
18:46:31.249 INFO TaskSchedulerImpl - Removed TaskSet 132.0, whose tasks have all completed, from pool
18:46:31.249 INFO DAGScheduler - ResultStage 132 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
18:46:31.249 INFO DAGScheduler - Job 93 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.249 INFO TaskSchedulerImpl - Killing all running tasks in stage 132: Stage finished
18:46:31.249 INFO DAGScheduler - Job 93 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060324 s
18:46:31.253 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:31.253 INFO DAGScheduler - Got job 94 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:31.253 INFO DAGScheduler - Final stage: ResultStage 133 (count at ReadsSparkSinkUnitTest.java:185)
18:46:31.253 INFO DAGScheduler - Parents of final stage: List()
18:46:31.253 INFO DAGScheduler - Missing parents: List()
18:46:31.253 INFO DAGScheduler - Submitting ResultStage 133 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:31.273 INFO MemoryStore - Block broadcast_246 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
18:46:31.275 INFO MemoryStore - Block broadcast_246_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
18:46:31.275 INFO BlockManagerInfo - Added broadcast_246_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.2 MiB)
18:46:31.275 INFO SparkContext - Created broadcast 246 from broadcast at DAGScheduler.scala:1580
18:46:31.275 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 133 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:31.275 INFO TaskSchedulerImpl - Adding task set 133.0 with 1 tasks resource profile 0
18:46:31.276 INFO TaskSetManager - Starting task 0.0 in stage 133.0 (TID 189) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:31.276 INFO Executor - Running task 0.0 in stage 133.0 (TID 189)
18:46:31.305 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19727986388591723413.bam:0+237038
18:46:31.316 INFO Executor - Finished task 0.0 in stage 133.0 (TID 189). 989 bytes result sent to driver
18:46:31.316 INFO TaskSetManager - Finished task 0.0 in stage 133.0 (TID 189) in 40 ms on localhost (executor driver) (1/1)
18:46:31.316 INFO TaskSchedulerImpl - Removed TaskSet 133.0, whose tasks have all completed, from pool
18:46:31.316 INFO DAGScheduler - ResultStage 133 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
18:46:31.317 INFO DAGScheduler - Job 94 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.317 INFO TaskSchedulerImpl - Killing all running tasks in stage 133: Stage finished
18:46:31.317 INFO DAGScheduler - Job 94 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.063940 s
18:46:31.319 INFO MemoryStore - Block broadcast_247 stored as values in memory (estimated size 297.9 KiB, free 1916.0 MiB)
18:46:31.325 INFO MemoryStore - Block broadcast_247_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
18:46:31.325 INFO BlockManagerInfo - Added broadcast_247_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:31.326 INFO SparkContext - Created broadcast 247 from newAPIHadoopFile at PathSplitSource.java:96
18:46:31.347 INFO MemoryStore - Block broadcast_248 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
18:46:31.353 INFO MemoryStore - Block broadcast_248_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.6 MiB)
18:46:31.353 INFO BlockManagerInfo - Added broadcast_248_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.1 MiB)
18:46:31.353 INFO SparkContext - Created broadcast 248 from newAPIHadoopFile at PathSplitSource.java:96
18:46:31.373 INFO FileInputFormat - Total input files to process : 1
18:46:31.374 INFO MemoryStore - Block broadcast_249 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
18:46:31.375 INFO MemoryStore - Block broadcast_249_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
18:46:31.375 INFO BlockManagerInfo - Added broadcast_249_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.1 MiB)
18:46:31.375 INFO SparkContext - Created broadcast 249 from broadcast at ReadsSparkSink.java:133
18:46:31.376 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:31.377 INFO MemoryStore - Block broadcast_250 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
18:46:31.377 INFO MemoryStore - Block broadcast_250_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
18:46:31.377 INFO BlockManagerInfo - Added broadcast_250_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.1 MiB)
18:46:31.378 INFO SparkContext - Created broadcast 250 from broadcast at BamSink.java:76
18:46:31.379 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:31.379 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:31.379 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:31.395 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:31.396 INFO DAGScheduler - Registering RDD 619 (mapToPair at SparkUtils.java:161) as input to shuffle 28
18:46:31.396 INFO DAGScheduler - Got job 95 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:31.396 INFO DAGScheduler - Final stage: ResultStage 135 (runJob at SparkHadoopWriter.scala:83)
18:46:31.396 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 134)
18:46:31.396 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 134)
18:46:31.396 INFO DAGScheduler - Submitting ShuffleMapStage 134 (MapPartitionsRDD[619] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:31.415 INFO MemoryStore - Block broadcast_251 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
18:46:31.417 INFO MemoryStore - Block broadcast_251_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
18:46:31.417 INFO BlockManagerInfo - Added broadcast_251_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1918.9 MiB)
18:46:31.417 INFO SparkContext - Created broadcast 251 from broadcast at DAGScheduler.scala:1580
18:46:31.417 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 134 (MapPartitionsRDD[619] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:31.417 INFO TaskSchedulerImpl - Adding task set 134.0 with 1 tasks resource profile 0
18:46:31.418 INFO TaskSetManager - Starting task 0.0 in stage 134.0 (TID 190) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:31.418 INFO Executor - Running task 0.0 in stage 134.0 (TID 190)
18:46:31.448 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:31.463 INFO Executor - Finished task 0.0 in stage 134.0 (TID 190). 1148 bytes result sent to driver
18:46:31.463 INFO TaskSetManager - Finished task 0.0 in stage 134.0 (TID 190) in 45 ms on localhost (executor driver) (1/1)
18:46:31.463 INFO TaskSchedulerImpl - Removed TaskSet 134.0, whose tasks have all completed, from pool
18:46:31.463 INFO DAGScheduler - ShuffleMapStage 134 (mapToPair at SparkUtils.java:161) finished in 0.067 s
18:46:31.463 INFO DAGScheduler - looking for newly runnable stages
18:46:31.463 INFO DAGScheduler - running: HashSet()
18:46:31.463 INFO DAGScheduler - waiting: HashSet(ResultStage 135)
18:46:31.463 INFO DAGScheduler - failed: HashSet()
18:46:31.464 INFO DAGScheduler - Submitting ResultStage 135 (MapPartitionsRDD[624] at mapToPair at BamSink.java:91), which has no missing parents
18:46:31.470 INFO MemoryStore - Block broadcast_252 stored as values in memory (estimated size 241.4 KiB, free 1914.4 MiB)
18:46:31.471 INFO MemoryStore - Block broadcast_252_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1914.3 MiB)
18:46:31.471 INFO BlockManagerInfo - Added broadcast_252_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1918.9 MiB)
18:46:31.471 INFO SparkContext - Created broadcast 252 from broadcast at DAGScheduler.scala:1580
18:46:31.472 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 135 (MapPartitionsRDD[624] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:31.472 INFO TaskSchedulerImpl - Adding task set 135.0 with 1 tasks resource profile 0
18:46:31.472 INFO TaskSetManager - Starting task 0.0 in stage 135.0 (TID 191) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:31.472 INFO Executor - Running task 0.0 in stage 135.0 (TID 191)
18:46:31.477 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:31.477 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:31.488 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:31.488 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:31.488 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:31.488 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:31.488 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:31.488 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:31.497 INFO BlockManagerInfo - Removed broadcast_248_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1918.9 MiB)
18:46:31.498 INFO BlockManagerInfo - Removed broadcast_241_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.1 MiB)
18:46:31.499 INFO BlockManagerInfo - Removed broadcast_245_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.2 MiB)
18:46:31.500 INFO BlockManagerInfo - Removed broadcast_240_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.2 MiB)
18:46:31.500 INFO BlockManagerInfo - Removed broadcast_243_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:31.501 INFO BlockManagerInfo - Removed broadcast_251_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.4 MiB)
18:46:31.502 INFO BlockManagerInfo - Removed broadcast_246_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.6 MiB)
18:46:31.502 INFO BlockManagerInfo - Removed broadcast_237_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:31.503 INFO BlockManagerInfo - Removed broadcast_239_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.7 MiB)
18:46:31.503 INFO BlockManagerInfo - Removed broadcast_242_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.7 MiB)
18:46:31.504 INFO BlockManagerInfo - Removed broadcast_244_piece0 on localhost:45727 in memory (size: 153.7 KiB, free: 1919.9 MiB)
18:46:31.514 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846316874983525237990180_0624_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest18049089034213526665.bam.parts/_temporary/0/task_202505191846316874983525237990180_0624_r_000000
18:46:31.514 INFO SparkHadoopMapRedUtil - attempt_202505191846316874983525237990180_0624_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:31.514 INFO Executor - Finished task 0.0 in stage 135.0 (TID 191). 1901 bytes result sent to driver
18:46:31.515 INFO TaskSetManager - Finished task 0.0 in stage 135.0 (TID 191) in 43 ms on localhost (executor driver) (1/1)
18:46:31.515 INFO TaskSchedulerImpl - Removed TaskSet 135.0, whose tasks have all completed, from pool
18:46:31.515 INFO DAGScheduler - ResultStage 135 (runJob at SparkHadoopWriter.scala:83) finished in 0.051 s
18:46:31.515 INFO DAGScheduler - Job 95 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.515 INFO TaskSchedulerImpl - Killing all running tasks in stage 135: Stage finished
18:46:31.515 INFO DAGScheduler - Job 95 finished: runJob at SparkHadoopWriter.scala:83, took 0.120075 s
18:46:31.516 INFO SparkHadoopWriter - Start to commit write Job job_202505191846316874983525237990180_0624.
18:46:31.521 INFO SparkHadoopWriter - Write Job job_202505191846316874983525237990180_0624 committed. Elapsed time: 5 ms.
18:46:31.534 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest18049089034213526665.bam
18:46:31.539 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest18049089034213526665.bam done
18:46:31.539 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest18049089034213526665.bam.parts/ to /tmp/ReadsSparkSinkUnitTest18049089034213526665.bam.sbi
18:46:31.544 INFO IndexFileMerger - Done merging .sbi files
18:46:31.545 INFO MemoryStore - Block broadcast_253 stored as values in memory (estimated size 320.0 B, free 1919.0 MiB)
18:46:31.546 INFO MemoryStore - Block broadcast_253_piece0 stored as bytes in memory (estimated size 233.0 B, free 1919.0 MiB)
18:46:31.546 INFO BlockManagerInfo - Added broadcast_253_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.9 MiB)
18:46:31.546 INFO SparkContext - Created broadcast 253 from broadcast at BamSource.java:104
18:46:31.547 INFO MemoryStore - Block broadcast_254 stored as values in memory (estimated size 297.9 KiB, free 1918.7 MiB)
18:46:31.554 INFO MemoryStore - Block broadcast_254_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
18:46:31.554 INFO BlockManagerInfo - Added broadcast_254_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.8 MiB)
18:46:31.555 INFO SparkContext - Created broadcast 254 from newAPIHadoopFile at PathSplitSource.java:96
18:46:31.563 INFO FileInputFormat - Total input files to process : 1
18:46:31.577 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:31.577 INFO DAGScheduler - Got job 96 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:31.577 INFO DAGScheduler - Final stage: ResultStage 136 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:31.577 INFO DAGScheduler - Parents of final stage: List()
18:46:31.577 INFO DAGScheduler - Missing parents: List()
18:46:31.577 INFO DAGScheduler - Submitting ResultStage 136 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:31.583 INFO MemoryStore - Block broadcast_255 stored as values in memory (estimated size 148.2 KiB, free 1918.5 MiB)
18:46:31.584 INFO MemoryStore - Block broadcast_255_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.5 MiB)
18:46:31.584 INFO BlockManagerInfo - Added broadcast_255_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.8 MiB)
18:46:31.585 INFO SparkContext - Created broadcast 255 from broadcast at DAGScheduler.scala:1580
18:46:31.585 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 136 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:31.585 INFO TaskSchedulerImpl - Adding task set 136.0 with 1 tasks resource profile 0
18:46:31.585 INFO TaskSetManager - Starting task 0.0 in stage 136.0 (TID 192) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:31.586 INFO Executor - Running task 0.0 in stage 136.0 (TID 192)
18:46:31.598 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest18049089034213526665.bam:0+237038
18:46:31.602 INFO Executor - Finished task 0.0 in stage 136.0 (TID 192). 651483 bytes result sent to driver
18:46:31.603 INFO TaskSetManager - Finished task 0.0 in stage 136.0 (TID 192) in 18 ms on localhost (executor driver) (1/1)
18:46:31.603 INFO TaskSchedulerImpl - Removed TaskSet 136.0, whose tasks have all completed, from pool
18:46:31.603 INFO DAGScheduler - ResultStage 136 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
18:46:31.603 INFO DAGScheduler - Job 96 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.603 INFO TaskSchedulerImpl - Killing all running tasks in stage 136: Stage finished
18:46:31.604 INFO DAGScheduler - Job 96 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026512 s
18:46:31.613 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:31.613 INFO DAGScheduler - Got job 97 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:31.613 INFO DAGScheduler - Final stage: ResultStage 137 (count at ReadsSparkSinkUnitTest.java:185)
18:46:31.613 INFO DAGScheduler - Parents of final stage: List()
18:46:31.613 INFO DAGScheduler - Missing parents: List()
18:46:31.613 INFO DAGScheduler - Submitting ResultStage 137 (MapPartitionsRDD[612] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:31.630 INFO MemoryStore - Block broadcast_256 stored as values in memory (estimated size 426.1 KiB, free 1918.1 MiB)
18:46:31.632 INFO MemoryStore - Block broadcast_256_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.9 MiB)
18:46:31.632 INFO BlockManagerInfo - Added broadcast_256_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.6 MiB)
18:46:31.632 INFO SparkContext - Created broadcast 256 from broadcast at DAGScheduler.scala:1580
18:46:31.632 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 137 (MapPartitionsRDD[612] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:31.632 INFO TaskSchedulerImpl - Adding task set 137.0 with 1 tasks resource profile 0
18:46:31.633 INFO TaskSetManager - Starting task 0.0 in stage 137.0 (TID 193) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:31.633 INFO Executor - Running task 0.0 in stage 137.0 (TID 193)
18:46:31.663 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:31.672 INFO Executor - Finished task 0.0 in stage 137.0 (TID 193). 989 bytes result sent to driver
18:46:31.673 INFO TaskSetManager - Finished task 0.0 in stage 137.0 (TID 193) in 40 ms on localhost (executor driver) (1/1)
18:46:31.673 INFO TaskSchedulerImpl - Removed TaskSet 137.0, whose tasks have all completed, from pool
18:46:31.673 INFO DAGScheduler - ResultStage 137 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
18:46:31.673 INFO DAGScheduler - Job 97 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.673 INFO TaskSchedulerImpl - Killing all running tasks in stage 137: Stage finished
18:46:31.673 INFO DAGScheduler - Job 97 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060398 s
18:46:31.677 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:31.677 INFO DAGScheduler - Got job 98 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:31.677 INFO DAGScheduler - Final stage: ResultStage 138 (count at ReadsSparkSinkUnitTest.java:185)
18:46:31.677 INFO DAGScheduler - Parents of final stage: List()
18:46:31.677 INFO DAGScheduler - Missing parents: List()
18:46:31.677 INFO DAGScheduler - Submitting ResultStage 138 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:31.684 INFO MemoryStore - Block broadcast_257 stored as values in memory (estimated size 148.1 KiB, free 1917.8 MiB)
18:46:31.685 INFO MemoryStore - Block broadcast_257_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.7 MiB)
18:46:31.685 INFO BlockManagerInfo - Added broadcast_257_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.6 MiB)
18:46:31.685 INFO SparkContext - Created broadcast 257 from broadcast at DAGScheduler.scala:1580
18:46:31.685 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 138 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:31.685 INFO TaskSchedulerImpl - Adding task set 138.0 with 1 tasks resource profile 0
18:46:31.686 INFO TaskSetManager - Starting task 0.0 in stage 138.0 (TID 194) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:31.686 INFO Executor - Running task 0.0 in stage 138.0 (TID 194)
18:46:31.697 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest18049089034213526665.bam:0+237038
18:46:31.700 INFO Executor - Finished task 0.0 in stage 138.0 (TID 194). 989 bytes result sent to driver
18:46:31.701 INFO TaskSetManager - Finished task 0.0 in stage 138.0 (TID 194) in 15 ms on localhost (executor driver) (1/1)
18:46:31.701 INFO TaskSchedulerImpl - Removed TaskSet 138.0, whose tasks have all completed, from pool
18:46:31.701 INFO DAGScheduler - ResultStage 138 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.024 s
18:46:31.701 INFO DAGScheduler - Job 98 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.701 INFO TaskSchedulerImpl - Killing all running tasks in stage 138: Stage finished
18:46:31.701 INFO DAGScheduler - Job 98 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.024442 s
18:46:31.704 INFO MemoryStore - Block broadcast_258 stored as values in memory (estimated size 297.9 KiB, free 1917.4 MiB)
18:46:31.710 INFO MemoryStore - Block broadcast_258_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.4 MiB)
18:46:31.710 INFO BlockManagerInfo - Added broadcast_258_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:31.710 INFO SparkContext - Created broadcast 258 from newAPIHadoopFile at PathSplitSource.java:96
18:46:31.731 INFO MemoryStore - Block broadcast_259 stored as values in memory (estimated size 297.9 KiB, free 1917.1 MiB)
18:46:31.738 INFO MemoryStore - Block broadcast_259_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
18:46:31.738 INFO BlockManagerInfo - Added broadcast_259_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:31.738 INFO SparkContext - Created broadcast 259 from newAPIHadoopFile at PathSplitSource.java:96
18:46:31.757 INFO FileInputFormat - Total input files to process : 1
18:46:31.759 INFO MemoryStore - Block broadcast_260 stored as values in memory (estimated size 160.7 KiB, free 1916.9 MiB)
18:46:31.759 INFO MemoryStore - Block broadcast_260_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.9 MiB)
18:46:31.760 INFO BlockManagerInfo - Added broadcast_260_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:31.760 INFO SparkContext - Created broadcast 260 from broadcast at ReadsSparkSink.java:133
18:46:31.760 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:31.760 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:31.761 INFO MemoryStore - Block broadcast_261 stored as values in memory (estimated size 163.2 KiB, free 1916.7 MiB)
18:46:31.762 INFO MemoryStore - Block broadcast_261_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.7 MiB)
18:46:31.762 INFO BlockManagerInfo - Added broadcast_261_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.4 MiB)
18:46:31.762 INFO SparkContext - Created broadcast 261 from broadcast at BamSink.java:76
18:46:31.764 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:31.764 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:31.764 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:31.780 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:31.781 INFO DAGScheduler - Registering RDD 644 (mapToPair at SparkUtils.java:161) as input to shuffle 29
18:46:31.781 INFO DAGScheduler - Got job 99 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:31.781 INFO DAGScheduler - Final stage: ResultStage 140 (runJob at SparkHadoopWriter.scala:83)
18:46:31.781 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 139)
18:46:31.781 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 139)
18:46:31.781 INFO DAGScheduler - Submitting ShuffleMapStage 139 (MapPartitionsRDD[644] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:31.798 INFO MemoryStore - Block broadcast_262 stored as values in memory (estimated size 520.4 KiB, free 1916.2 MiB)
18:46:31.800 INFO MemoryStore - Block broadcast_262_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.0 MiB)
18:46:31.800 INFO BlockManagerInfo - Added broadcast_262_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.3 MiB)
18:46:31.800 INFO SparkContext - Created broadcast 262 from broadcast at DAGScheduler.scala:1580
18:46:31.800 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 139 (MapPartitionsRDD[644] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:31.800 INFO TaskSchedulerImpl - Adding task set 139.0 with 1 tasks resource profile 0
18:46:31.801 INFO TaskSetManager - Starting task 0.0 in stage 139.0 (TID 195) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:31.801 INFO Executor - Running task 0.0 in stage 139.0 (TID 195)
18:46:31.832 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:31.847 INFO Executor - Finished task 0.0 in stage 139.0 (TID 195). 1148 bytes result sent to driver
18:46:31.848 INFO TaskSetManager - Finished task 0.0 in stage 139.0 (TID 195) in 47 ms on localhost (executor driver) (1/1)
18:46:31.848 INFO TaskSchedulerImpl - Removed TaskSet 139.0, whose tasks have all completed, from pool
18:46:31.848 INFO DAGScheduler - ShuffleMapStage 139 (mapToPair at SparkUtils.java:161) finished in 0.067 s
18:46:31.848 INFO DAGScheduler - looking for newly runnable stages
18:46:31.848 INFO DAGScheduler - running: HashSet()
18:46:31.848 INFO DAGScheduler - waiting: HashSet(ResultStage 140)
18:46:31.848 INFO DAGScheduler - failed: HashSet()
18:46:31.848 INFO DAGScheduler - Submitting ResultStage 140 (MapPartitionsRDD[649] at mapToPair at BamSink.java:91), which has no missing parents
18:46:31.855 INFO MemoryStore - Block broadcast_263 stored as values in memory (estimated size 241.4 KiB, free 1915.8 MiB)
18:46:31.856 INFO MemoryStore - Block broadcast_263_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.7 MiB)
18:46:31.856 INFO BlockManagerInfo - Added broadcast_263_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.2 MiB)
18:46:31.856 INFO SparkContext - Created broadcast 263 from broadcast at DAGScheduler.scala:1580
18:46:31.856 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 140 (MapPartitionsRDD[649] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:31.856 INFO TaskSchedulerImpl - Adding task set 140.0 with 1 tasks resource profile 0
18:46:31.857 INFO TaskSetManager - Starting task 0.0 in stage 140.0 (TID 196) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:31.857 INFO Executor - Running task 0.0 in stage 140.0 (TID 196)
18:46:31.861 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:31.861 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:31.872 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:31.872 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:31.872 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:31.873 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:31.873 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:31.873 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:31.887 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846316030445757188684601_0649_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest17312506348861175590.bam.parts/_temporary/0/task_202505191846316030445757188684601_0649_r_000000
18:46:31.887 INFO SparkHadoopMapRedUtil - attempt_202505191846316030445757188684601_0649_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:31.888 INFO Executor - Finished task 0.0 in stage 140.0 (TID 196). 1858 bytes result sent to driver
18:46:31.888 INFO TaskSetManager - Finished task 0.0 in stage 140.0 (TID 196) in 31 ms on localhost (executor driver) (1/1)
18:46:31.888 INFO TaskSchedulerImpl - Removed TaskSet 140.0, whose tasks have all completed, from pool
18:46:31.888 INFO DAGScheduler - ResultStage 140 (runJob at SparkHadoopWriter.scala:83) finished in 0.040 s
18:46:31.888 INFO DAGScheduler - Job 99 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:31.888 INFO TaskSchedulerImpl - Killing all running tasks in stage 140: Stage finished
18:46:31.888 INFO DAGScheduler - Job 99 finished: runJob at SparkHadoopWriter.scala:83, took 0.108144 s
18:46:31.889 INFO SparkHadoopWriter - Start to commit write Job job_202505191846316030445757188684601_0649.
18:46:31.893 INFO SparkHadoopWriter - Write Job job_202505191846316030445757188684601_0649 committed. Elapsed time: 4 ms.
18:46:31.903 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest17312506348861175590.bam
18:46:31.907 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest17312506348861175590.bam done
18:46:31.910 INFO MemoryStore - Block broadcast_264 stored as values in memory (estimated size 297.9 KiB, free 1915.4 MiB)
18:46:31.916 INFO MemoryStore - Block broadcast_264_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.4 MiB)
18:46:31.916 INFO BlockManagerInfo - Added broadcast_264_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:31.917 INFO SparkContext - Created broadcast 264 from newAPIHadoopFile at PathSplitSource.java:96
18:46:31.936 INFO FileInputFormat - Total input files to process : 1
18:46:31.970 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:31.971 INFO DAGScheduler - Got job 100 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:31.971 INFO DAGScheduler - Final stage: ResultStage 141 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:31.971 INFO DAGScheduler - Parents of final stage: List()
18:46:31.971 INFO DAGScheduler - Missing parents: List()
18:46:31.971 INFO DAGScheduler - Submitting ResultStage 141 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:31.988 INFO MemoryStore - Block broadcast_265 stored as values in memory (estimated size 426.2 KiB, free 1915.0 MiB)
18:46:31.989 INFO MemoryStore - Block broadcast_265_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1914.8 MiB)
18:46:31.989 INFO BlockManagerInfo - Added broadcast_265_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.0 MiB)
18:46:31.989 INFO SparkContext - Created broadcast 265 from broadcast at DAGScheduler.scala:1580
18:46:31.990 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 141 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:31.990 INFO TaskSchedulerImpl - Adding task set 141.0 with 1 tasks resource profile 0
18:46:31.990 INFO TaskSetManager - Starting task 0.0 in stage 141.0 (TID 197) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:31.990 INFO Executor - Running task 0.0 in stage 141.0 (TID 197)
18:46:32.019 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17312506348861175590.bam:0+237038
18:46:32.031 INFO Executor - Finished task 0.0 in stage 141.0 (TID 197). 651526 bytes result sent to driver
18:46:32.033 INFO TaskSetManager - Finished task 0.0 in stage 141.0 (TID 197) in 43 ms on localhost (executor driver) (1/1)
18:46:32.033 INFO TaskSchedulerImpl - Removed TaskSet 141.0, whose tasks have all completed, from pool
18:46:32.033 INFO DAGScheduler - ResultStage 141 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.062 s
18:46:32.033 INFO DAGScheduler - Job 100 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.033 INFO TaskSchedulerImpl - Killing all running tasks in stage 141: Stage finished
18:46:32.033 INFO DAGScheduler - Job 100 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.062703 s
18:46:32.049 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:32.049 INFO DAGScheduler - Got job 101 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:32.049 INFO DAGScheduler - Final stage: ResultStage 142 (count at ReadsSparkSinkUnitTest.java:185)
18:46:32.049 INFO DAGScheduler - Parents of final stage: List()
18:46:32.049 INFO DAGScheduler - Missing parents: List()
18:46:32.049 INFO DAGScheduler - Submitting ResultStage 142 (MapPartitionsRDD[637] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:32.069 INFO MemoryStore - Block broadcast_266 stored as values in memory (estimated size 426.1 KiB, free 1914.4 MiB)
18:46:32.075 INFO BlockManagerInfo - Removed broadcast_250_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.0 MiB)
18:46:32.076 INFO BlockManagerInfo - Removed broadcast_255_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.1 MiB)
18:46:32.076 INFO MemoryStore - Block broadcast_266_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1914.6 MiB)
18:46:32.076 INFO BlockManagerInfo - Added broadcast_266_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1918.9 MiB)
18:46:32.077 INFO SparkContext - Created broadcast 266 from broadcast at DAGScheduler.scala:1580
18:46:32.077 INFO BlockManagerInfo - Removed broadcast_262_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.1 MiB)
18:46:32.077 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 142 (MapPartitionsRDD[637] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:32.077 INFO TaskSchedulerImpl - Adding task set 142.0 with 1 tasks resource profile 0
18:46:32.077 INFO TaskSetManager - Starting task 0.0 in stage 142.0 (TID 198) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:32.078 INFO Executor - Running task 0.0 in stage 142.0 (TID 198)
18:46:32.078 INFO BlockManagerInfo - Removed broadcast_261_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.1 MiB)
18:46:32.078 INFO BlockManagerInfo - Removed broadcast_252_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.2 MiB)
18:46:32.079 INFO BlockManagerInfo - Removed broadcast_265_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:32.080 INFO BlockManagerInfo - Removed broadcast_257_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.4 MiB)
18:46:32.081 INFO BlockManagerInfo - Removed broadcast_256_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.5 MiB)
18:46:32.082 INFO BlockManagerInfo - Removed broadcast_263_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.6 MiB)
18:46:32.082 INFO BlockManagerInfo - Removed broadcast_260_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:32.083 INFO BlockManagerInfo - Removed broadcast_259_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:32.084 INFO BlockManagerInfo - Removed broadcast_253_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.6 MiB)
18:46:32.084 INFO BlockManagerInfo - Removed broadcast_247_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:32.085 INFO BlockManagerInfo - Removed broadcast_254_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:32.085 INFO BlockManagerInfo - Removed broadcast_249_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:32.109 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:32.118 INFO Executor - Finished task 0.0 in stage 142.0 (TID 198). 989 bytes result sent to driver
18:46:32.118 INFO TaskSetManager - Finished task 0.0 in stage 142.0 (TID 198) in 41 ms on localhost (executor driver) (1/1)
18:46:32.118 INFO TaskSchedulerImpl - Removed TaskSet 142.0, whose tasks have all completed, from pool
18:46:32.118 INFO DAGScheduler - ResultStage 142 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.068 s
18:46:32.119 INFO DAGScheduler - Job 101 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.119 INFO TaskSchedulerImpl - Killing all running tasks in stage 142: Stage finished
18:46:32.119 INFO DAGScheduler - Job 101 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.069727 s
18:46:32.122 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:32.123 INFO DAGScheduler - Got job 102 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:32.123 INFO DAGScheduler - Final stage: ResultStage 143 (count at ReadsSparkSinkUnitTest.java:185)
18:46:32.123 INFO DAGScheduler - Parents of final stage: List()
18:46:32.123 INFO DAGScheduler - Missing parents: List()
18:46:32.123 INFO DAGScheduler - Submitting ResultStage 143 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:32.140 INFO MemoryStore - Block broadcast_267 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
18:46:32.141 INFO MemoryStore - Block broadcast_267_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
18:46:32.141 INFO BlockManagerInfo - Added broadcast_267_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.6 MiB)
18:46:32.141 INFO SparkContext - Created broadcast 267 from broadcast at DAGScheduler.scala:1580
18:46:32.141 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 143 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:32.141 INFO TaskSchedulerImpl - Adding task set 143.0 with 1 tasks resource profile 0
18:46:32.142 INFO TaskSetManager - Starting task 0.0 in stage 143.0 (TID 199) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:32.142 INFO Executor - Running task 0.0 in stage 143.0 (TID 199)
18:46:32.172 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17312506348861175590.bam:0+237038
18:46:32.183 INFO Executor - Finished task 0.0 in stage 143.0 (TID 199). 989 bytes result sent to driver
18:46:32.184 INFO TaskSetManager - Finished task 0.0 in stage 143.0 (TID 199) in 42 ms on localhost (executor driver) (1/1)
18:46:32.184 INFO TaskSchedulerImpl - Removed TaskSet 143.0, whose tasks have all completed, from pool
18:46:32.184 INFO DAGScheduler - ResultStage 143 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.061 s
18:46:32.184 INFO DAGScheduler - Job 102 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.184 INFO TaskSchedulerImpl - Killing all running tasks in stage 143: Stage finished
18:46:32.184 INFO DAGScheduler - Job 102 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.061761 s
18:46:32.187 INFO MemoryStore - Block broadcast_268 stored as values in memory (estimated size 298.0 KiB, free 1917.9 MiB)
18:46:32.193 INFO MemoryStore - Block broadcast_268_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.8 MiB)
18:46:32.193 INFO BlockManagerInfo - Added broadcast_268_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.6 MiB)
18:46:32.193 INFO SparkContext - Created broadcast 268 from newAPIHadoopFile at PathSplitSource.java:96
18:46:32.215 INFO MemoryStore - Block broadcast_269 stored as values in memory (estimated size 298.0 KiB, free 1917.6 MiB)
18:46:32.221 INFO MemoryStore - Block broadcast_269_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.5 MiB)
18:46:32.221 INFO BlockManagerInfo - Added broadcast_269_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.5 MiB)
18:46:32.221 INFO SparkContext - Created broadcast 269 from newAPIHadoopFile at PathSplitSource.java:96
18:46:32.240 INFO FileInputFormat - Total input files to process : 1
18:46:32.242 INFO MemoryStore - Block broadcast_270 stored as values in memory (estimated size 160.7 KiB, free 1917.4 MiB)
18:46:32.243 INFO MemoryStore - Block broadcast_270_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
18:46:32.243 INFO BlockManagerInfo - Added broadcast_270_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:32.243 INFO SparkContext - Created broadcast 270 from broadcast at ReadsSparkSink.java:133
18:46:32.244 INFO MemoryStore - Block broadcast_271 stored as values in memory (estimated size 163.2 KiB, free 1917.2 MiB)
18:46:32.245 INFO MemoryStore - Block broadcast_271_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.2 MiB)
18:46:32.245 INFO BlockManagerInfo - Added broadcast_271_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:32.245 INFO SparkContext - Created broadcast 271 from broadcast at BamSink.java:76
18:46:32.247 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:32.247 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:32.247 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:32.263 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:32.264 INFO DAGScheduler - Registering RDD 670 (mapToPair at SparkUtils.java:161) as input to shuffle 30
18:46:32.264 INFO DAGScheduler - Got job 103 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:32.264 INFO DAGScheduler - Final stage: ResultStage 145 (runJob at SparkHadoopWriter.scala:83)
18:46:32.264 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 144)
18:46:32.264 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 144)
18:46:32.264 INFO DAGScheduler - Submitting ShuffleMapStage 144 (MapPartitionsRDD[670] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:32.281 INFO MemoryStore - Block broadcast_272 stored as values in memory (estimated size 520.4 KiB, free 1916.7 MiB)
18:46:32.283 INFO MemoryStore - Block broadcast_272_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.5 MiB)
18:46:32.283 INFO BlockManagerInfo - Added broadcast_272_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.3 MiB)
18:46:32.283 INFO SparkContext - Created broadcast 272 from broadcast at DAGScheduler.scala:1580
18:46:32.283 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 144 (MapPartitionsRDD[670] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:32.283 INFO TaskSchedulerImpl - Adding task set 144.0 with 1 tasks resource profile 0
18:46:32.284 INFO TaskSetManager - Starting task 0.0 in stage 144.0 (TID 200) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
18:46:32.284 INFO Executor - Running task 0.0 in stage 144.0 (TID 200)
18:46:32.313 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:32.329 INFO Executor - Finished task 0.0 in stage 144.0 (TID 200). 1148 bytes result sent to driver
18:46:32.329 INFO TaskSetManager - Finished task 0.0 in stage 144.0 (TID 200) in 45 ms on localhost (executor driver) (1/1)
18:46:32.330 INFO TaskSchedulerImpl - Removed TaskSet 144.0, whose tasks have all completed, from pool
18:46:32.330 INFO DAGScheduler - ShuffleMapStage 144 (mapToPair at SparkUtils.java:161) finished in 0.066 s
18:46:32.330 INFO DAGScheduler - looking for newly runnable stages
18:46:32.330 INFO DAGScheduler - running: HashSet()
18:46:32.330 INFO DAGScheduler - waiting: HashSet(ResultStage 145)
18:46:32.330 INFO DAGScheduler - failed: HashSet()
18:46:32.330 INFO DAGScheduler - Submitting ResultStage 145 (MapPartitionsRDD[675] at mapToPair at BamSink.java:91), which has no missing parents
18:46:32.337 INFO MemoryStore - Block broadcast_273 stored as values in memory (estimated size 241.4 KiB, free 1916.3 MiB)
18:46:32.338 INFO MemoryStore - Block broadcast_273_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.2 MiB)
18:46:32.338 INFO BlockManagerInfo - Added broadcast_273_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.3 MiB)
18:46:32.338 INFO SparkContext - Created broadcast 273 from broadcast at DAGScheduler.scala:1580
18:46:32.338 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 145 (MapPartitionsRDD[675] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:32.338 INFO TaskSchedulerImpl - Adding task set 145.0 with 1 tasks resource profile 0
18:46:32.339 INFO TaskSetManager - Starting task 0.0 in stage 145.0 (TID 201) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:32.339 INFO Executor - Running task 0.0 in stage 145.0 (TID 201)
18:46:32.343 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:32.343 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:32.354 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:32.354 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:32.354 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:32.354 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:32.354 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:32.354 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:32.377 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846326705185187657658136_0675_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest25035466088897914257.bam.parts/_temporary/0/task_202505191846326705185187657658136_0675_r_000000
18:46:32.377 INFO SparkHadoopMapRedUtil - attempt_202505191846326705185187657658136_0675_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:32.377 INFO Executor - Finished task 0.0 in stage 145.0 (TID 201). 1858 bytes result sent to driver
18:46:32.378 INFO TaskSetManager - Finished task 0.0 in stage 145.0 (TID 201) in 40 ms on localhost (executor driver) (1/1)
18:46:32.378 INFO TaskSchedulerImpl - Removed TaskSet 145.0, whose tasks have all completed, from pool
18:46:32.378 INFO DAGScheduler - ResultStage 145 (runJob at SparkHadoopWriter.scala:83) finished in 0.048 s
18:46:32.378 INFO DAGScheduler - Job 103 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.378 INFO TaskSchedulerImpl - Killing all running tasks in stage 145: Stage finished
18:46:32.378 INFO DAGScheduler - Job 103 finished: runJob at SparkHadoopWriter.scala:83, took 0.114710 s
18:46:32.378 INFO SparkHadoopWriter - Start to commit write Job job_202505191846326705185187657658136_0675.
18:46:32.383 INFO SparkHadoopWriter - Write Job job_202505191846326705185187657658136_0675 committed. Elapsed time: 5 ms.
18:46:32.395 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest25035466088897914257.bam
18:46:32.399 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest25035466088897914257.bam done
18:46:32.399 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest25035466088897914257.bam.parts/ to /tmp/ReadsSparkSinkUnitTest25035466088897914257.bam.sbi
18:46:32.403 INFO IndexFileMerger - Done merging .sbi files
18:46:32.403 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest25035466088897914257.bam.parts/ to /tmp/ReadsSparkSinkUnitTest25035466088897914257.bam.bai
18:46:32.408 INFO IndexFileMerger - Done merging .bai files
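
[Editor's note: a minimal, self-contained sketch of the concatenation step logged just above. The writer leaves per-task part files under a <output>.parts/ directory and HadoopFileSystemWrapper then stitches them into the single final BAM before the .sbi/.bai index parts are merged. Everything below (the "part-" prefix, the local-filesystem paths, the class name ConcatParts) is an illustrative assumption, not the actual GATK/Disq implementation.]

    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.file.*;
    import java.util.List;
    import java.util.stream.Collectors;

    // Simplified analogue of "Concatenating 3 parts to /tmp/...bam":
    // append each part file, in order, onto a single target file.
    public class ConcatParts {
        public static void concat(Path partsDir, Path target) throws IOException {
            List<Path> parts;
            try (var stream = Files.list(partsDir)) {
                parts = stream.filter(p -> p.getFileName().toString().startsWith("part-")) // assumed naming
                              .sorted()                                                    // deterministic order
                              .collect(Collectors.toList());
            }
            try (OutputStream out = Files.newOutputStream(target,
                    StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING)) {
                for (Path part : parts) {
                    Files.copy(part, out); // append this part's bytes to the target
                }
            }
        }
    }
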
18:46:32.410 INFO MemoryStore - Block broadcast_274 stored as values in memory (estimated size 320.0 B, free 1916.2 MiB)
18:46:32.410 INFO MemoryStore - Block broadcast_274_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.2 MiB)
18:46:32.410 INFO BlockManagerInfo - Added broadcast_274_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.3 MiB)
18:46:32.411 INFO SparkContext - Created broadcast 274 from broadcast at BamSource.java:104
18:46:32.412 INFO MemoryStore - Block broadcast_275 stored as values in memory (estimated size 297.9 KiB, free 1915.9 MiB)
18:46:32.418 INFO MemoryStore - Block broadcast_275_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.9 MiB)
18:46:32.418 INFO BlockManagerInfo - Added broadcast_275_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:32.418 INFO SparkContext - Created broadcast 275 from newAPIHadoopFile at PathSplitSource.java:96
18:46:32.426 INFO FileInputFormat - Total input files to process : 1
18:46:32.440 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:32.441 INFO DAGScheduler - Got job 104 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:32.441 INFO DAGScheduler - Final stage: ResultStage 146 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:32.441 INFO DAGScheduler - Parents of final stage: List()
18:46:32.441 INFO DAGScheduler - Missing parents: List()
18:46:32.441 INFO DAGScheduler - Submitting ResultStage 146 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:32.447 INFO MemoryStore - Block broadcast_276 stored as values in memory (estimated size 148.2 KiB, free 1915.7 MiB)
18:46:32.448 INFO MemoryStore - Block broadcast_276_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.7 MiB)
18:46:32.448 INFO BlockManagerInfo - Added broadcast_276_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.2 MiB)
18:46:32.448 INFO SparkContext - Created broadcast 276 from broadcast at DAGScheduler.scala:1580
18:46:32.448 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 146 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:32.448 INFO TaskSchedulerImpl - Adding task set 146.0 with 1 tasks resource profile 0
18:46:32.448 INFO TaskSetManager - Starting task 0.0 in stage 146.0 (TID 202) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:32.449 INFO Executor - Running task 0.0 in stage 146.0 (TID 202)
18:46:32.460 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest25035466088897914257.bam:0+235514
18:46:32.465 INFO Executor - Finished task 0.0 in stage 146.0 (TID 202). 650184 bytes result sent to driver
18:46:32.466 INFO TaskSetManager - Finished task 0.0 in stage 146.0 (TID 202) in 18 ms on localhost (executor driver) (1/1)
18:46:32.466 INFO TaskSchedulerImpl - Removed TaskSet 146.0, whose tasks have all completed, from pool
18:46:32.466 INFO DAGScheduler - ResultStage 146 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
18:46:32.466 INFO DAGScheduler - Job 104 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.466 INFO TaskSchedulerImpl - Killing all running tasks in stage 146: Stage finished
18:46:32.466 INFO DAGScheduler - Job 104 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025791 s
18:46:32.475 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:32.476 INFO DAGScheduler - Got job 105 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:32.476 INFO DAGScheduler - Final stage: ResultStage 147 (count at ReadsSparkSinkUnitTest.java:185)
18:46:32.476 INFO DAGScheduler - Parents of final stage: List()
18:46:32.476 INFO DAGScheduler - Missing parents: List()
18:46:32.476 INFO DAGScheduler - Submitting ResultStage 147 (MapPartitionsRDD[663] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:32.493 INFO MemoryStore - Block broadcast_277 stored as values in memory (estimated size 426.1 KiB, free 1915.2 MiB)
18:46:32.494 INFO MemoryStore - Block broadcast_277_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.1 MiB)
18:46:32.494 INFO BlockManagerInfo - Added broadcast_277_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.0 MiB)
18:46:32.494 INFO SparkContext - Created broadcast 277 from broadcast at DAGScheduler.scala:1580
18:46:32.494 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 147 (MapPartitionsRDD[663] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:32.495 INFO TaskSchedulerImpl - Adding task set 147.0 with 1 tasks resource profile 0
18:46:32.495 INFO TaskSetManager - Starting task 0.0 in stage 147.0 (TID 203) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
18:46:32.495 INFO Executor - Running task 0.0 in stage 147.0 (TID 203)
18:46:32.524 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:32.535 INFO Executor - Finished task 0.0 in stage 147.0 (TID 203). 989 bytes result sent to driver
18:46:32.536 INFO TaskSetManager - Finished task 0.0 in stage 147.0 (TID 203) in 41 ms on localhost (executor driver) (1/1)
18:46:32.536 INFO TaskSchedulerImpl - Removed TaskSet 147.0, whose tasks have all completed, from pool
18:46:32.536 INFO DAGScheduler - ResultStage 147 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
18:46:32.536 INFO DAGScheduler - Job 105 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.536 INFO TaskSchedulerImpl - Killing all running tasks in stage 147: Stage finished
18:46:32.536 INFO DAGScheduler - Job 105 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060747 s
18:46:32.539 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:32.540 INFO DAGScheduler - Got job 106 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:32.540 INFO DAGScheduler - Final stage: ResultStage 148 (count at ReadsSparkSinkUnitTest.java:185)
18:46:32.540 INFO DAGScheduler - Parents of final stage: List()
18:46:32.540 INFO DAGScheduler - Missing parents: List()
18:46:32.540 INFO DAGScheduler - Submitting ResultStage 148 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:32.546 INFO MemoryStore - Block broadcast_278 stored as values in memory (estimated size 148.1 KiB, free 1915.0 MiB)
18:46:32.546 INFO MemoryStore - Block broadcast_278_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1914.9 MiB)
18:46:32.547 INFO BlockManagerInfo - Added broadcast_278_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.0 MiB)
18:46:32.547 INFO SparkContext - Created broadcast 278 from broadcast at DAGScheduler.scala:1580
18:46:32.547 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 148 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:32.547 INFO TaskSchedulerImpl - Adding task set 148.0 with 1 tasks resource profile 0
18:46:32.547 INFO TaskSetManager - Starting task 0.0 in stage 148.0 (TID 204) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:32.548 INFO Executor - Running task 0.0 in stage 148.0 (TID 204)
18:46:32.558 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest25035466088897914257.bam:0+235514
18:46:32.561 INFO Executor - Finished task 0.0 in stage 148.0 (TID 204). 989 bytes result sent to driver
18:46:32.562 INFO TaskSetManager - Finished task 0.0 in stage 148.0 (TID 204) in 15 ms on localhost (executor driver) (1/1)
18:46:32.562 INFO TaskSchedulerImpl - Removed TaskSet 148.0, whose tasks have all completed, from pool
18:46:32.562 INFO DAGScheduler - ResultStage 148 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
18:46:32.562 INFO DAGScheduler - Job 106 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.562 INFO TaskSchedulerImpl - Killing all running tasks in stage 148: Stage finished
18:46:32.562 INFO DAGScheduler - Job 106 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022614 s
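
[Editor's note: jobs 104-106 above are the read-back verification (collect at ReadsSparkSinkUnitTest.java:182, count at :185 for the original and the written file). The sketch below is a self-contained analogue of that round-trip count check using only core Spark APIs; plain text files stand in for the BAM/CRAM handling done by ReadsSparkSource/ReadsSparkSink, and the output path is an assumed scratch location.]

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import java.util.Arrays;

    // Write an RDD out, read it back, and check that no records were lost,
    // mirroring the count-vs-count assertion driving the jobs in this log.
    public class RoundTripCountCheck {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("RoundTripCountCheck").setMaster("local[1]");
            try (JavaSparkContext ctx = new JavaSparkContext(conf)) {
                JavaRDD<String> original = ctx.parallelize(Arrays.asList("read1", "read2", "read3"));
                String out = "/tmp/round-trip-output-" + System.nanoTime(); // assumed unique scratch dir
                original.saveAsTextFile(out);            // "write" job, like Job 103 above
                JavaRDD<String> reread = ctx.textFile(out); // "read back" job, like Job 104
                if (reread.count() != original.count()) {   // the counts compared at line 185
                    throw new AssertionError("round trip lost records");
                }
            }
        }
    }
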
18:46:32.564 INFO MemoryStore - Block broadcast_279 stored as values in memory (estimated size 298.0 KiB, free 1914.6 MiB)
18:46:32.571 INFO BlockManagerInfo - Removed broadcast_275_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.0 MiB)
18:46:32.571 INFO BlockManagerInfo - Removed broadcast_269_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.0 MiB)
18:46:32.571 INFO BlockManagerInfo - Removed broadcast_270_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.1 MiB)
18:46:32.572 INFO BlockManagerInfo - Removed broadcast_258_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:32.573 INFO BlockManagerInfo - Removed broadcast_276_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.2 MiB)
18:46:32.574 INFO BlockManagerInfo - Removed broadcast_267_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:32.574 INFO BlockManagerInfo - Removed broadcast_273_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.4 MiB)
18:46:32.575 INFO BlockManagerInfo - Removed broadcast_272_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.5 MiB)
18:46:32.575 INFO BlockManagerInfo - Removed broadcast_271_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:32.576 INFO BlockManagerInfo - Removed broadcast_277_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:32.576 INFO BlockManagerInfo - Removed broadcast_266_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.8 MiB)
18:46:32.577 INFO BlockManagerInfo - Removed broadcast_278_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.9 MiB)
18:46:32.577 INFO BlockManagerInfo - Removed broadcast_264_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1920.0 MiB)
18:46:32.578 INFO BlockManagerInfo - Removed broadcast_268_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1920.0 MiB)
18:46:32.578 INFO BlockManagerInfo - Removed broadcast_274_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1920.0 MiB)
18:46:32.579 INFO MemoryStore - Block broadcast_279_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.7 MiB)
18:46:32.579 INFO BlockManagerInfo - Added broadcast_279_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1920.0 MiB)
18:46:32.579 INFO SparkContext - Created broadcast 279 from newAPIHadoopFile at PathSplitSource.java:96
18:46:32.609 INFO MemoryStore - Block broadcast_280 stored as values in memory (estimated size 298.0 KiB, free 1919.4 MiB)
18:46:32.615 INFO MemoryStore - Block broadcast_280_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
18:46:32.615 INFO BlockManagerInfo - Added broadcast_280_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.9 MiB)
18:46:32.615 INFO SparkContext - Created broadcast 280 from newAPIHadoopFile at PathSplitSource.java:96
18:46:32.635 INFO FileInputFormat - Total input files to process : 1
18:46:32.636 INFO MemoryStore - Block broadcast_281 stored as values in memory (estimated size 19.6 KiB, free 1919.3 MiB)
18:46:32.637 INFO MemoryStore - Block broadcast_281_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1919.3 MiB)
18:46:32.637 INFO BlockManagerInfo - Added broadcast_281_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.9 MiB)
18:46:32.637 INFO SparkContext - Created broadcast 281 from broadcast at ReadsSparkSink.java:133
18:46:32.638 INFO MemoryStore - Block broadcast_282 stored as values in memory (estimated size 20.0 KiB, free 1919.3 MiB)
18:46:32.638 INFO MemoryStore - Block broadcast_282_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1919.3 MiB)
18:46:32.638 INFO BlockManagerInfo - Added broadcast_282_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.9 MiB)
18:46:32.639 INFO SparkContext - Created broadcast 282 from broadcast at BamSink.java:76
18:46:32.640 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:32.640 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:32.640 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:32.657 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:32.658 INFO DAGScheduler - Registering RDD 695 (mapToPair at SparkUtils.java:161) as input to shuffle 31
18:46:32.658 INFO DAGScheduler - Got job 107 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:32.658 INFO DAGScheduler - Final stage: ResultStage 150 (runJob at SparkHadoopWriter.scala:83)
18:46:32.658 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 149)
18:46:32.658 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 149)
18:46:32.658 INFO DAGScheduler - Submitting ShuffleMapStage 149 (MapPartitionsRDD[695] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:32.676 INFO MemoryStore - Block broadcast_283 stored as values in memory (estimated size 434.3 KiB, free 1918.9 MiB)
18:46:32.678 INFO MemoryStore - Block broadcast_283_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1918.7 MiB)
18:46:32.678 INFO BlockManagerInfo - Added broadcast_283_piece0 in memory on localhost:45727 (size: 157.6 KiB, free: 1919.7 MiB)
18:46:32.678 INFO SparkContext - Created broadcast 283 from broadcast at DAGScheduler.scala:1580
18:46:32.678 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 149 (MapPartitionsRDD[695] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:32.678 INFO TaskSchedulerImpl - Adding task set 149.0 with 1 tasks resource profile 0
18:46:32.679 INFO TaskSetManager - Starting task 0.0 in stage 149.0 (TID 205) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
18:46:32.679 INFO Executor - Running task 0.0 in stage 149.0 (TID 205)
18:46:32.713 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:32.725 INFO Executor - Finished task 0.0 in stage 149.0 (TID 205). 1148 bytes result sent to driver
18:46:32.726 INFO TaskSetManager - Finished task 0.0 in stage 149.0 (TID 205) in 47 ms on localhost (executor driver) (1/1)
18:46:32.726 INFO TaskSchedulerImpl - Removed TaskSet 149.0, whose tasks have all completed, from pool
18:46:32.726 INFO DAGScheduler - ShuffleMapStage 149 (mapToPair at SparkUtils.java:161) finished in 0.068 s
18:46:32.726 INFO DAGScheduler - looking for newly runnable stages
18:46:32.726 INFO DAGScheduler - running: HashSet()
18:46:32.726 INFO DAGScheduler - waiting: HashSet(ResultStage 150)
18:46:32.726 INFO DAGScheduler - failed: HashSet()
18:46:32.726 INFO DAGScheduler - Submitting ResultStage 150 (MapPartitionsRDD[700] at mapToPair at BamSink.java:91), which has no missing parents
18:46:32.732 INFO MemoryStore - Block broadcast_284 stored as values in memory (estimated size 155.3 KiB, free 1918.5 MiB)
18:46:32.733 INFO MemoryStore - Block broadcast_284_piece0 stored as bytes in memory (estimated size 58.5 KiB, free 1918.5 MiB)
18:46:32.733 INFO BlockManagerInfo - Added broadcast_284_piece0 in memory on localhost:45727 (size: 58.5 KiB, free: 1919.7 MiB)
18:46:32.733 INFO SparkContext - Created broadcast 284 from broadcast at DAGScheduler.scala:1580
18:46:32.734 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 150 (MapPartitionsRDD[700] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:32.734 INFO TaskSchedulerImpl - Adding task set 150.0 with 1 tasks resource profile 0
18:46:32.734 INFO TaskSetManager - Starting task 0.0 in stage 150.0 (TID 206) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:32.734 INFO Executor - Running task 0.0 in stage 150.0 (TID 206)
18:46:32.738 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:32.738 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:32.749 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:32.749 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:32.749 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:32.749 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:32.749 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:32.749 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:32.770 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846325975941590885431921_0700_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest314363827472561368518.bam.parts/_temporary/0/task_202505191846325975941590885431921_0700_r_000000
18:46:32.770 INFO SparkHadoopMapRedUtil - attempt_202505191846325975941590885431921_0700_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:32.771 INFO Executor - Finished task 0.0 in stage 150.0 (TID 206). 1858 bytes result sent to driver
18:46:32.771 INFO TaskSetManager - Finished task 0.0 in stage 150.0 (TID 206) in 37 ms on localhost (executor driver) (1/1)
18:46:32.771 INFO TaskSchedulerImpl - Removed TaskSet 150.0, whose tasks have all completed, from pool
18:46:32.771 INFO DAGScheduler - ResultStage 150 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
18:46:32.772 INFO DAGScheduler - Job 107 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.772 INFO TaskSchedulerImpl - Killing all running tasks in stage 150: Stage finished
18:46:32.772 INFO DAGScheduler - Job 107 finished: runJob at SparkHadoopWriter.scala:83, took 0.114451 s
18:46:32.772 INFO SparkHadoopWriter - Start to commit write Job job_202505191846325975941590885431921_0700.
18:46:32.776 INFO SparkHadoopWriter - Write Job job_202505191846325975941590885431921_0700 committed. Elapsed time: 4 ms.
18:46:32.787 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest314363827472561368518.bam
18:46:32.791 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest314363827472561368518.bam done
18:46:32.791 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest314363827472561368518.bam.parts/ to /tmp/ReadsSparkSinkUnitTest314363827472561368518.bam.sbi
18:46:32.796 INFO IndexFileMerger - Done merging .sbi files
18:46:32.796 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest314363827472561368518.bam.parts/ to /tmp/ReadsSparkSinkUnitTest314363827472561368518.bam.bai
18:46:32.800 INFO IndexFileMerger - Done merging .bai files
18:46:32.802 INFO MemoryStore - Block broadcast_285 stored as values in memory (estimated size 312.0 B, free 1918.5 MiB)
18:46:32.802 INFO MemoryStore - Block broadcast_285_piece0 stored as bytes in memory (estimated size 231.0 B, free 1918.5 MiB)
18:46:32.802 INFO BlockManagerInfo - Added broadcast_285_piece0 in memory on localhost:45727 (size: 231.0 B, free: 1919.7 MiB)
18:46:32.802 INFO SparkContext - Created broadcast 285 from broadcast at BamSource.java:104
18:46:32.803 INFO MemoryStore - Block broadcast_286 stored as values in memory (estimated size 297.9 KiB, free 1918.2 MiB)
18:46:32.809 INFO MemoryStore - Block broadcast_286_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
18:46:32.809 INFO BlockManagerInfo - Added broadcast_286_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:32.810 INFO SparkContext - Created broadcast 286 from newAPIHadoopFile at PathSplitSource.java:96
18:46:32.818 INFO FileInputFormat - Total input files to process : 1
18:46:32.832 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:32.832 INFO DAGScheduler - Got job 108 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:32.832 INFO DAGScheduler - Final stage: ResultStage 151 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:32.832 INFO DAGScheduler - Parents of final stage: List()
18:46:32.832 INFO DAGScheduler - Missing parents: List()
18:46:32.833 INFO DAGScheduler - Submitting ResultStage 151 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:32.839 INFO MemoryStore - Block broadcast_287 stored as values in memory (estimated size 148.2 KiB, free 1918.0 MiB)
18:46:32.840 INFO MemoryStore - Block broadcast_287_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.0 MiB)
18:46:32.840 INFO BlockManagerInfo - Added broadcast_287_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:32.840 INFO SparkContext - Created broadcast 287 from broadcast at DAGScheduler.scala:1580
18:46:32.840 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 151 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:32.840 INFO TaskSchedulerImpl - Adding task set 151.0 with 1 tasks resource profile 0
18:46:32.841 INFO TaskSetManager - Starting task 0.0 in stage 151.0 (TID 207) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:32.841 INFO Executor - Running task 0.0 in stage 151.0 (TID 207)
18:46:32.852 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest314363827472561368518.bam:0+236517
18:46:32.856 INFO Executor - Finished task 0.0 in stage 151.0 (TID 207). 749470 bytes result sent to driver
18:46:32.858 INFO TaskSetManager - Finished task 0.0 in stage 151.0 (TID 207) in 16 ms on localhost (executor driver) (1/1)
18:46:32.858 INFO TaskSchedulerImpl - Removed TaskSet 151.0, whose tasks have all completed, from pool
18:46:32.858 INFO DAGScheduler - ResultStage 151 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
18:46:32.858 INFO DAGScheduler - Job 108 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.858 INFO TaskSchedulerImpl - Killing all running tasks in stage 151: Stage finished
18:46:32.858 INFO DAGScheduler - Job 108 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025798 s
18:46:32.868 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:32.868 INFO DAGScheduler - Got job 109 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:32.868 INFO DAGScheduler - Final stage: ResultStage 152 (count at ReadsSparkSinkUnitTest.java:185)
18:46:32.869 INFO DAGScheduler - Parents of final stage: List()
18:46:32.869 INFO DAGScheduler - Missing parents: List()
18:46:32.869 INFO DAGScheduler - Submitting ResultStage 152 (MapPartitionsRDD[688] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:32.886 INFO MemoryStore - Block broadcast_288 stored as values in memory (estimated size 426.1 KiB, free 1917.5 MiB)
18:46:32.887 INFO MemoryStore - Block broadcast_288_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
18:46:32.887 INFO BlockManagerInfo - Added broadcast_288_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.4 MiB)
18:46:32.887 INFO SparkContext - Created broadcast 288 from broadcast at DAGScheduler.scala:1580
18:46:32.887 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 152 (MapPartitionsRDD[688] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:32.887 INFO TaskSchedulerImpl - Adding task set 152.0 with 1 tasks resource profile 0
18:46:32.888 INFO TaskSetManager - Starting task 0.0 in stage 152.0 (TID 208) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
18:46:32.888 INFO Executor - Running task 0.0 in stage 152.0 (TID 208)
18:46:32.917 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:32.924 INFO Executor - Finished task 0.0 in stage 152.0 (TID 208). 989 bytes result sent to driver
18:46:32.924 INFO TaskSetManager - Finished task 0.0 in stage 152.0 (TID 208) in 36 ms on localhost (executor driver) (1/1)
18:46:32.924 INFO TaskSchedulerImpl - Removed TaskSet 152.0, whose tasks have all completed, from pool
18:46:32.925 INFO DAGScheduler - ResultStage 152 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.056 s
18:46:32.925 INFO DAGScheduler - Job 109 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.925 INFO TaskSchedulerImpl - Killing all running tasks in stage 152: Stage finished
18:46:32.925 INFO DAGScheduler - Job 109 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.056458 s
18:46:32.928 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:32.928 INFO DAGScheduler - Got job 110 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:32.928 INFO DAGScheduler - Final stage: ResultStage 153 (count at ReadsSparkSinkUnitTest.java:185)
18:46:32.928 INFO DAGScheduler - Parents of final stage: List()
18:46:32.928 INFO DAGScheduler - Missing parents: List()
18:46:32.928 INFO DAGScheduler - Submitting ResultStage 153 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:32.934 INFO MemoryStore - Block broadcast_289 stored as values in memory (estimated size 148.1 KiB, free 1917.2 MiB)
18:46:32.935 INFO MemoryStore - Block broadcast_289_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.2 MiB)
18:46:32.935 INFO BlockManagerInfo - Added broadcast_289_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.4 MiB)
18:46:32.935 INFO SparkContext - Created broadcast 289 from broadcast at DAGScheduler.scala:1580
18:46:32.936 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 153 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:32.936 INFO TaskSchedulerImpl - Adding task set 153.0 with 1 tasks resource profile 0
18:46:32.936 INFO TaskSetManager - Starting task 0.0 in stage 153.0 (TID 209) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:32.936 INFO Executor - Running task 0.0 in stage 153.0 (TID 209)
18:46:32.949 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest314363827472561368518.bam:0+236517
18:46:32.952 INFO Executor - Finished task 0.0 in stage 153.0 (TID 209). 989 bytes result sent to driver
18:46:32.952 INFO TaskSetManager - Finished task 0.0 in stage 153.0 (TID 209) in 16 ms on localhost (executor driver) (1/1)
18:46:32.952 INFO TaskSchedulerImpl - Removed TaskSet 153.0, whose tasks have all completed, from pool
18:46:32.952 INFO DAGScheduler - ResultStage 153 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.023 s
18:46:32.952 INFO DAGScheduler - Job 110 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:32.952 INFO TaskSchedulerImpl - Killing all running tasks in stage 153: Stage finished
18:46:32.953 INFO DAGScheduler - Job 110 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.024477 s
18:46:32.955 INFO MemoryStore - Block broadcast_290 stored as values in memory (estimated size 576.0 B, free 1917.2 MiB)
18:46:32.955 INFO MemoryStore - Block broadcast_290_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.2 MiB)
18:46:32.955 INFO BlockManagerInfo - Added broadcast_290_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.4 MiB)
18:46:32.956 INFO SparkContext - Created broadcast 290 from broadcast at CramSource.java:114
18:46:32.957 INFO MemoryStore - Block broadcast_291 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
18:46:32.963 INFO MemoryStore - Block broadcast_291_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
18:46:32.963 INFO BlockManagerInfo - Added broadcast_291_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:32.963 INFO SparkContext - Created broadcast 291 from newAPIHadoopFile at PathSplitSource.java:96
18:46:32.977 INFO MemoryStore - Block broadcast_292 stored as values in memory (estimated size 576.0 B, free 1916.8 MiB)
18:46:32.978 INFO MemoryStore - Block broadcast_292_piece0 stored as bytes in memory (estimated size 228.0 B, free 1916.8 MiB)
18:46:32.978 INFO BlockManagerInfo - Added broadcast_292_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.3 MiB)
18:46:32.978 INFO SparkContext - Created broadcast 292 from broadcast at CramSource.java:114
18:46:32.979 INFO MemoryStore - Block broadcast_293 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
18:46:32.986 INFO BlockManagerInfo - Removed broadcast_286_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:32.987 INFO BlockManagerInfo - Removed broadcast_285_piece0 on localhost:45727 in memory (size: 231.0 B, free: 1919.4 MiB)
18:46:32.987 INFO BlockManagerInfo - Removed broadcast_283_piece0 on localhost:45727 in memory (size: 157.6 KiB, free: 1919.5 MiB)
18:46:32.988 INFO BlockManagerInfo - Removed broadcast_288_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:32.988 INFO BlockManagerInfo - Removed broadcast_282_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.7 MiB)
18:46:32.989 INFO BlockManagerInfo - Removed broadcast_279_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:32.989 INFO BlockManagerInfo - Removed broadcast_287_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.8 MiB)
18:46:32.989 INFO BlockManagerInfo - Removed broadcast_289_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.8 MiB)
18:46:32.990 INFO BlockManagerInfo - Removed broadcast_284_piece0 on localhost:45727 in memory (size: 58.5 KiB, free: 1919.9 MiB)
18:46:32.990 INFO BlockManagerInfo - Removed broadcast_280_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.9 MiB)
18:46:32.991 INFO BlockManagerInfo - Removed broadcast_281_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1920.0 MiB)
18:46:32.994 INFO MemoryStore - Block broadcast_293_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
18:46:32.994 INFO BlockManagerInfo - Added broadcast_293_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.9 MiB)
18:46:32.994 INFO SparkContext - Created broadcast 293 from newAPIHadoopFile at PathSplitSource.java:96
18:46:33.009 INFO FileInputFormat - Total input files to process : 1
18:46:33.010 INFO MemoryStore - Block broadcast_294 stored as values in memory (estimated size 6.0 KiB, free 1919.3 MiB)
18:46:33.011 INFO MemoryStore - Block broadcast_294_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
18:46:33.011 INFO BlockManagerInfo - Added broadcast_294_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.9 MiB)
18:46:33.011 INFO SparkContext - Created broadcast 294 from broadcast at ReadsSparkSink.java:133
18:46:33.012 INFO MemoryStore - Block broadcast_295 stored as values in memory (estimated size 6.2 KiB, free 1919.3 MiB)
18:46:33.012 INFO MemoryStore - Block broadcast_295_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
18:46:33.012 INFO BlockManagerInfo - Added broadcast_295_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.9 MiB)
18:46:33.012 INFO SparkContext - Created broadcast 295 from broadcast at CramSink.java:76
18:46:33.014 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:33.014 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:33.014 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:33.030 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:33.030 INFO DAGScheduler - Registering RDD 718 (mapToPair at SparkUtils.java:161) as input to shuffle 32
18:46:33.031 INFO DAGScheduler - Got job 111 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:33.031 INFO DAGScheduler - Final stage: ResultStage 155 (runJob at SparkHadoopWriter.scala:83)
18:46:33.031 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 154)
18:46:33.031 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 154)
18:46:33.031 INFO DAGScheduler - Submitting ShuffleMapStage 154 (MapPartitionsRDD[718] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:33.043 INFO MemoryStore - Block broadcast_296 stored as values in memory (estimated size 292.8 KiB, free 1919.0 MiB)
18:46:33.044 INFO MemoryStore - Block broadcast_296_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1918.9 MiB)
18:46:33.044 INFO BlockManagerInfo - Added broadcast_296_piece0 in memory on localhost:45727 (size: 107.3 KiB, free: 1919.8 MiB)
18:46:33.044 INFO SparkContext - Created broadcast 296 from broadcast at DAGScheduler.scala:1580
18:46:33.044 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 154 (MapPartitionsRDD[718] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:33.044 INFO TaskSchedulerImpl - Adding task set 154.0 with 1 tasks resource profile 0
18:46:33.045 INFO TaskSetManager - Starting task 0.0 in stage 154.0 (TID 210) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
18:46:33.045 INFO Executor - Running task 0.0 in stage 154.0 (TID 210)
18:46:33.066 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:33.077 INFO Executor - Finished task 0.0 in stage 154.0 (TID 210). 1148 bytes result sent to driver
18:46:33.078 INFO TaskSetManager - Finished task 0.0 in stage 154.0 (TID 210) in 33 ms on localhost (executor driver) (1/1)
18:46:33.078 INFO TaskSchedulerImpl - Removed TaskSet 154.0, whose tasks have all completed, from pool
18:46:33.078 INFO DAGScheduler - ShuffleMapStage 154 (mapToPair at SparkUtils.java:161) finished in 0.047 s
18:46:33.078 INFO DAGScheduler - looking for newly runnable stages
18:46:33.078 INFO DAGScheduler - running: HashSet()
18:46:33.078 INFO DAGScheduler - waiting: HashSet(ResultStage 155)
18:46:33.078 INFO DAGScheduler - failed: HashSet()
18:46:33.078 INFO DAGScheduler - Submitting ResultStage 155 (MapPartitionsRDD[723] at mapToPair at CramSink.java:89), which has no missing parents
18:46:33.085 INFO MemoryStore - Block broadcast_297 stored as values in memory (estimated size 153.2 KiB, free 1918.8 MiB)
18:46:33.086 INFO MemoryStore - Block broadcast_297_piece0 stored as bytes in memory (estimated size 58.0 KiB, free 1918.7 MiB)
18:46:33.086 INFO BlockManagerInfo - Added broadcast_297_piece0 in memory on localhost:45727 (size: 58.0 KiB, free: 1919.7 MiB)
18:46:33.086 INFO SparkContext - Created broadcast 297 from broadcast at DAGScheduler.scala:1580
18:46:33.086 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 155 (MapPartitionsRDD[723] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
18:46:33.086 INFO TaskSchedulerImpl - Adding task set 155.0 with 1 tasks resource profile 0
18:46:33.087 INFO TaskSetManager - Starting task 0.0 in stage 155.0 (TID 211) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:33.087 INFO Executor - Running task 0.0 in stage 155.0 (TID 211)
18:46:33.091 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:33.091 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:33.097 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:33.097 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:33.097 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:33.097 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:33.097 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:33.097 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:33.151 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846332902242898514439263_0723_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest516045056167857282668.cram.parts/_temporary/0/task_202505191846332902242898514439263_0723_r_000000
18:46:33.151 INFO SparkHadoopMapRedUtil - attempt_202505191846332902242898514439263_0723_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:33.151 INFO Executor - Finished task 0.0 in stage 155.0 (TID 211). 1858 bytes result sent to driver
18:46:33.152 INFO TaskSetManager - Finished task 0.0 in stage 155.0 (TID 211) in 65 ms on localhost (executor driver) (1/1)
18:46:33.152 INFO TaskSchedulerImpl - Removed TaskSet 155.0, whose tasks have all completed, from pool
18:46:33.152 INFO DAGScheduler - ResultStage 155 (runJob at SparkHadoopWriter.scala:83) finished in 0.073 s
18:46:33.152 INFO DAGScheduler - Job 111 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.152 INFO TaskSchedulerImpl - Killing all running tasks in stage 155: Stage finished
18:46:33.152 INFO DAGScheduler - Job 111 finished: runJob at SparkHadoopWriter.scala:83, took 0.121854 s
18:46:33.152 INFO SparkHadoopWriter - Start to commit write Job job_202505191846332902242898514439263_0723.
18:46:33.157 INFO SparkHadoopWriter - Write Job job_202505191846332902242898514439263_0723 committed. Elapsed time: 4 ms.
18:46:33.169 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest516045056167857282668.cram
18:46:33.173 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest516045056167857282668.cram done
18:46:33.175 INFO MemoryStore - Block broadcast_298 stored as values in memory (estimated size 504.0 B, free 1918.7 MiB)
18:46:33.176 INFO MemoryStore - Block broadcast_298_piece0 stored as bytes in memory (estimated size 160.0 B, free 1918.7 MiB)
18:46:33.176 INFO BlockManagerInfo - Added broadcast_298_piece0 in memory on localhost:45727 (size: 160.0 B, free: 1919.7 MiB)
18:46:33.176 INFO SparkContext - Created broadcast 298 from broadcast at CramSource.java:114
18:46:33.177 INFO MemoryStore - Block broadcast_299 stored as values in memory (estimated size 297.9 KiB, free 1918.4 MiB)
18:46:33.183 INFO MemoryStore - Block broadcast_299_piece0 stored as bytes in memory (estimated size 50.1 KiB, free 1918.4 MiB)
18:46:33.183 INFO BlockManagerInfo - Added broadcast_299_piece0 in memory on localhost:45727 (size: 50.1 KiB, free: 1919.7 MiB)
18:46:33.183 INFO SparkContext - Created broadcast 299 from newAPIHadoopFile at PathSplitSource.java:96
18:46:33.198 INFO FileInputFormat - Total input files to process : 1
18:46:33.223 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:33.223 INFO DAGScheduler - Got job 112 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:33.223 INFO DAGScheduler - Final stage: ResultStage 156 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:33.223 INFO DAGScheduler - Parents of final stage: List()
18:46:33.223 INFO DAGScheduler - Missing parents: List()
18:46:33.223 INFO DAGScheduler - Submitting ResultStage 156 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:33.241 INFO MemoryStore - Block broadcast_300 stored as values in memory (estimated size 286.8 KiB, free 1918.1 MiB)
18:46:33.242 INFO MemoryStore - Block broadcast_300_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.0 MiB)
18:46:33.243 INFO BlockManagerInfo - Added broadcast_300_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.6 MiB)
18:46:33.243 INFO SparkContext - Created broadcast 300 from broadcast at DAGScheduler.scala:1580
18:46:33.243 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 156 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:33.243 INFO TaskSchedulerImpl - Adding task set 156.0 with 1 tasks resource profile 0
18:46:33.243 INFO TaskSetManager - Starting task 0.0 in stage 156.0 (TID 212) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7811 bytes)
18:46:33.244 INFO Executor - Running task 0.0 in stage 156.0 (TID 212)
18:46:33.264 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest516045056167857282668.cram:0+43713
18:46:33.286 INFO Executor - Finished task 0.0 in stage 156.0 (TID 212). 154101 bytes result sent to driver
18:46:33.287 INFO TaskSetManager - Finished task 0.0 in stage 156.0 (TID 212) in 44 ms on localhost (executor driver) (1/1)
18:46:33.287 INFO TaskSchedulerImpl - Removed TaskSet 156.0, whose tasks have all completed, from pool
18:46:33.287 INFO DAGScheduler - ResultStage 156 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.064 s
18:46:33.287 INFO DAGScheduler - Job 112 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.287 INFO TaskSchedulerImpl - Killing all running tasks in stage 156: Stage finished
18:46:33.287 INFO DAGScheduler - Job 112 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.064585 s
18:46:33.292 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:33.293 INFO DAGScheduler - Got job 113 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:33.293 INFO DAGScheduler - Final stage: ResultStage 157 (count at ReadsSparkSinkUnitTest.java:185)
18:46:33.293 INFO DAGScheduler - Parents of final stage: List()
18:46:33.293 INFO DAGScheduler - Missing parents: List()
18:46:33.293 INFO DAGScheduler - Submitting ResultStage 157 (MapPartitionsRDD[712] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:33.304 INFO MemoryStore - Block broadcast_301 stored as values in memory (estimated size 286.8 KiB, free 1917.7 MiB)
18:46:33.305 INFO MemoryStore - Block broadcast_301_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.6 MiB)
18:46:33.305 INFO BlockManagerInfo - Added broadcast_301_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.5 MiB)
18:46:33.305 INFO SparkContext - Created broadcast 301 from broadcast at DAGScheduler.scala:1580
18:46:33.306 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 157 (MapPartitionsRDD[712] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:33.306 INFO TaskSchedulerImpl - Adding task set 157.0 with 1 tasks resource profile 0
18:46:33.306 INFO TaskSetManager - Starting task 0.0 in stage 157.0 (TID 213) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
18:46:33.307 INFO Executor - Running task 0.0 in stage 157.0 (TID 213)
18:46:33.327 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:33.334 INFO Executor - Finished task 0.0 in stage 157.0 (TID 213). 989 bytes result sent to driver
18:46:33.334 INFO TaskSetManager - Finished task 0.0 in stage 157.0 (TID 213) in 28 ms on localhost (executor driver) (1/1)
18:46:33.334 INFO TaskSchedulerImpl - Removed TaskSet 157.0, whose tasks have all completed, from pool
18:46:33.335 INFO DAGScheduler - ResultStage 157 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.041 s
18:46:33.335 INFO DAGScheduler - Job 113 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.335 INFO TaskSchedulerImpl - Killing all running tasks in stage 157: Stage finished
18:46:33.335 INFO DAGScheduler - Job 113 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.042392 s
18:46:33.339 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:33.340 INFO DAGScheduler - Got job 114 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:33.340 INFO DAGScheduler - Final stage: ResultStage 158 (count at ReadsSparkSinkUnitTest.java:185)
18:46:33.340 INFO DAGScheduler - Parents of final stage: List()
18:46:33.340 INFO DAGScheduler - Missing parents: List()
18:46:33.340 INFO DAGScheduler - Submitting ResultStage 158 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:33.351 INFO MemoryStore - Block broadcast_302 stored as values in memory (estimated size 286.8 KiB, free 1917.3 MiB)
18:46:33.352 INFO MemoryStore - Block broadcast_302_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.2 MiB)
18:46:33.352 INFO BlockManagerInfo - Added broadcast_302_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.4 MiB)
18:46:33.353 INFO SparkContext - Created broadcast 302 from broadcast at DAGScheduler.scala:1580
18:46:33.353 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 158 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:33.353 INFO TaskSchedulerImpl - Adding task set 158.0 with 1 tasks resource profile 0
18:46:33.353 INFO TaskSetManager - Starting task 0.0 in stage 158.0 (TID 214) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7811 bytes)
18:46:33.354 INFO Executor - Running task 0.0 in stage 158.0 (TID 214)
18:46:33.373 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest516045056167857282668.cram:0+43713
18:46:33.383 INFO Executor - Finished task 0.0 in stage 158.0 (TID 214). 989 bytes result sent to driver
18:46:33.383 INFO TaskSetManager - Finished task 0.0 in stage 158.0 (TID 214) in 30 ms on localhost (executor driver) (1/1)
18:46:33.383 INFO TaskSchedulerImpl - Removed TaskSet 158.0, whose tasks have all completed, from pool
18:46:33.383 INFO DAGScheduler - ResultStage 158 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.043 s
18:46:33.383 INFO DAGScheduler - Job 114 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.383 INFO TaskSchedulerImpl - Killing all running tasks in stage 158: Stage finished
18:46:33.384 INFO DAGScheduler - Job 114 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.044057 s
18:46:33.386 INFO MemoryStore - Block broadcast_303 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
18:46:33.392 INFO MemoryStore - Block broadcast_303_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
18:46:33.392 INFO BlockManagerInfo - Added broadcast_303_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:33.393 INFO SparkContext - Created broadcast 303 from newAPIHadoopFile at PathSplitSource.java:96
18:46:33.414 INFO MemoryStore - Block broadcast_304 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
18:46:33.421 INFO MemoryStore - Block broadcast_304_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.5 MiB)
18:46:33.421 INFO BlockManagerInfo - Added broadcast_304_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:33.421 INFO SparkContext - Created broadcast 304 from newAPIHadoopFile at PathSplitSource.java:96
18:46:33.441 INFO FileInputFormat - Total input files to process : 1
18:46:33.443 INFO MemoryStore - Block broadcast_305 stored as values in memory (estimated size 160.7 KiB, free 1916.4 MiB)
18:46:33.443 INFO MemoryStore - Block broadcast_305_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.4 MiB)
18:46:33.443 INFO BlockManagerInfo - Added broadcast_305_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:33.444 INFO SparkContext - Created broadcast 305 from broadcast at ReadsSparkSink.java:133
18:46:33.447 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
18:46:33.447 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:33.447 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:33.463 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:33.464 INFO DAGScheduler - Registering RDD 743 (mapToPair at SparkUtils.java:161) as input to shuffle 33
18:46:33.464 INFO DAGScheduler - Got job 115 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:33.464 INFO DAGScheduler - Final stage: ResultStage 160 (runJob at SparkHadoopWriter.scala:83)
18:46:33.464 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 159)
18:46:33.464 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 159)
18:46:33.464 INFO DAGScheduler - Submitting ShuffleMapStage 159 (MapPartitionsRDD[743] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:33.481 INFO MemoryStore - Block broadcast_306 stored as values in memory (estimated size 520.4 KiB, free 1915.9 MiB)
18:46:33.487 INFO BlockManagerInfo - Removed broadcast_297_piece0 on localhost:45727 in memory (size: 58.0 KiB, free: 1919.3 MiB)
18:46:33.487 INFO BlockManagerInfo - Removed broadcast_290_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.3 MiB)
18:46:33.487 INFO MemoryStore - Block broadcast_306_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
18:46:33.487 INFO BlockManagerInfo - Removed broadcast_298_piece0 on localhost:45727 in memory (size: 160.0 B, free: 1919.3 MiB)
18:46:33.488 INFO BlockManagerInfo - Added broadcast_306_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.2 MiB)
18:46:33.488 INFO SparkContext - Created broadcast 306 from broadcast at DAGScheduler.scala:1580
18:46:33.488 INFO BlockManagerInfo - Removed broadcast_296_piece0 on localhost:45727 in memory (size: 107.3 KiB, free: 1919.3 MiB)
18:46:33.488 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 159 (MapPartitionsRDD[743] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:33.488 INFO TaskSchedulerImpl - Adding task set 159.0 with 1 tasks resource profile 0
18:46:33.488 INFO BlockManagerInfo - Removed broadcast_301_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.4 MiB)
18:46:33.489 INFO TaskSetManager - Starting task 0.0 in stage 159.0 (TID 215) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:33.489 INFO BlockManagerInfo - Removed broadcast_302_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.5 MiB)
18:46:33.489 INFO Executor - Running task 0.0 in stage 159.0 (TID 215)
18:46:33.489 INFO BlockManagerInfo - Removed broadcast_291_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:33.490 INFO BlockManagerInfo - Removed broadcast_304_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:33.490 INFO BlockManagerInfo - Removed broadcast_292_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.6 MiB)
18:46:33.490 INFO BlockManagerInfo - Removed broadcast_299_piece0 on localhost:45727 in memory (size: 50.1 KiB, free: 1919.6 MiB)
18:46:33.491 INFO BlockManagerInfo - Removed broadcast_294_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.6 MiB)
18:46:33.491 INFO BlockManagerInfo - Removed broadcast_295_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.6 MiB)
18:46:33.492 INFO BlockManagerInfo - Removed broadcast_293_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:33.492 INFO BlockManagerInfo - Removed broadcast_300_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.8 MiB)
18:46:33.519 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:33.534 INFO Executor - Finished task 0.0 in stage 159.0 (TID 215). 1148 bytes result sent to driver
18:46:33.534 INFO TaskSetManager - Finished task 0.0 in stage 159.0 (TID 215) in 46 ms on localhost (executor driver) (1/1)
18:46:33.534 INFO TaskSchedulerImpl - Removed TaskSet 159.0, whose tasks have all completed, from pool
18:46:33.534 INFO DAGScheduler - ShuffleMapStage 159 (mapToPair at SparkUtils.java:161) finished in 0.070 s
18:46:33.534 INFO DAGScheduler - looking for newly runnable stages
18:46:33.534 INFO DAGScheduler - running: HashSet()
18:46:33.534 INFO DAGScheduler - waiting: HashSet(ResultStage 160)
18:46:33.534 INFO DAGScheduler - failed: HashSet()
18:46:33.535 INFO DAGScheduler - Submitting ResultStage 160 (MapPartitionsRDD[749] at saveAsTextFile at SamSink.java:65), which has no missing parents
18:46:33.546 INFO MemoryStore - Block broadcast_307 stored as values in memory (estimated size 241.1 KiB, free 1918.6 MiB)
18:46:33.547 INFO MemoryStore - Block broadcast_307_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1918.5 MiB)
18:46:33.547 INFO BlockManagerInfo - Added broadcast_307_piece0 in memory on localhost:45727 (size: 66.9 KiB, free: 1919.7 MiB)
18:46:33.547 INFO SparkContext - Created broadcast 307 from broadcast at DAGScheduler.scala:1580
18:46:33.547 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 160 (MapPartitionsRDD[749] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
18:46:33.547 INFO TaskSchedulerImpl - Adding task set 160.0 with 1 tasks resource profile 0
18:46:33.548 INFO TaskSetManager - Starting task 0.0 in stage 160.0 (TID 216) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:33.548 INFO Executor - Running task 0.0 in stage 160.0 (TID 216)
18:46:33.552 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:33.552 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:33.563 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
18:46:33.563 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:33.563 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:33.580 INFO FileOutputCommitter - Saved output of task 'attempt_2025051918463374949645577789105_0749_m_000000_0' to file:/tmp/ReadsSparkSinkUnitTest65787526726048799445.sam.parts/_temporary/0/task_2025051918463374949645577789105_0749_m_000000
18:46:33.580 INFO SparkHadoopMapRedUtil - attempt_2025051918463374949645577789105_0749_m_000000_0: Committed. Elapsed time: 0 ms.
18:46:33.580 INFO Executor - Finished task 0.0 in stage 160.0 (TID 216). 1858 bytes result sent to driver
18:46:33.581 INFO TaskSetManager - Finished task 0.0 in stage 160.0 (TID 216) in 33 ms on localhost (executor driver) (1/1)
18:46:33.581 INFO TaskSchedulerImpl - Removed TaskSet 160.0, whose tasks have all completed, from pool
18:46:33.581 INFO DAGScheduler - ResultStage 160 (runJob at SparkHadoopWriter.scala:83) finished in 0.046 s
18:46:33.581 INFO DAGScheduler - Job 115 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.581 INFO TaskSchedulerImpl - Killing all running tasks in stage 160: Stage finished
18:46:33.581 INFO DAGScheduler - Job 115 finished: runJob at SparkHadoopWriter.scala:83, took 0.117628 s
18:46:33.581 INFO SparkHadoopWriter - Start to commit write Job job_2025051918463374949645577789105_0749.
18:46:33.585 INFO SparkHadoopWriter - Write Job job_2025051918463374949645577789105_0749 committed. Elapsed time: 4 ms.
18:46:33.593 INFO HadoopFileSystemWrapper - Concatenating 2 parts to /tmp/ReadsSparkSinkUnitTest65787526726048799445.sam
18:46:33.598 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest65787526726048799445.sam done
WARNING 2025-05-19 18:46:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-05-19 18:46:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:33.601 INFO MemoryStore - Block broadcast_308 stored as values in memory (estimated size 160.7 KiB, free 1918.4 MiB)
18:46:33.602 INFO MemoryStore - Block broadcast_308_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.4 MiB)
18:46:33.602 INFO BlockManagerInfo - Added broadcast_308_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.7 MiB)
18:46:33.602 INFO SparkContext - Created broadcast 308 from broadcast at SamSource.java:78
18:46:33.603 INFO MemoryStore - Block broadcast_309 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:33.609 INFO MemoryStore - Block broadcast_309_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:33.609 INFO BlockManagerInfo - Added broadcast_309_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:33.610 INFO SparkContext - Created broadcast 309 from newAPIHadoopFile at SamSource.java:108
18:46:33.612 INFO FileInputFormat - Total input files to process : 1
18:46:33.615 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:33.615 INFO DAGScheduler - Got job 116 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:33.615 INFO DAGScheduler - Final stage: ResultStage 161 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:33.615 INFO DAGScheduler - Parents of final stage: List()
18:46:33.616 INFO DAGScheduler - Missing parents: List()
18:46:33.616 INFO DAGScheduler - Submitting ResultStage 161 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:33.616 INFO MemoryStore - Block broadcast_310 stored as values in memory (estimated size 7.5 KiB, free 1918.0 MiB)
18:46:33.616 INFO MemoryStore - Block broadcast_310_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.0 MiB)
18:46:33.617 INFO BlockManagerInfo - Added broadcast_310_piece0 in memory on localhost:45727 (size: 3.8 KiB, free: 1919.7 MiB)
18:46:33.617 INFO SparkContext - Created broadcast 310 from broadcast at DAGScheduler.scala:1580
18:46:33.617 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 161 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:33.617 INFO TaskSchedulerImpl - Adding task set 161.0 with 1 tasks resource profile 0
18:46:33.617 INFO TaskSetManager - Starting task 0.0 in stage 161.0 (TID 217) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:33.618 INFO Executor - Running task 0.0 in stage 161.0 (TID 217)
18:46:33.619 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest65787526726048799445.sam:0+847558
18:46:33.634 INFO Executor - Finished task 0.0 in stage 161.0 (TID 217). 651483 bytes result sent to driver
18:46:33.636 INFO TaskSetManager - Finished task 0.0 in stage 161.0 (TID 217) in 19 ms on localhost (executor driver) (1/1)
18:46:33.636 INFO TaskSchedulerImpl - Removed TaskSet 161.0, whose tasks have all completed, from pool
18:46:33.636 INFO DAGScheduler - ResultStage 161 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.020 s
18:46:33.636 INFO DAGScheduler - Job 116 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.636 INFO TaskSchedulerImpl - Killing all running tasks in stage 161: Stage finished
18:46:33.636 INFO DAGScheduler - Job 116 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.020825 s
18:46:33.645 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:33.646 INFO DAGScheduler - Got job 117 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:33.646 INFO DAGScheduler - Final stage: ResultStage 162 (count at ReadsSparkSinkUnitTest.java:185)
18:46:33.646 INFO DAGScheduler - Parents of final stage: List()
18:46:33.646 INFO DAGScheduler - Missing parents: List()
18:46:33.646 INFO DAGScheduler - Submitting ResultStage 162 (MapPartitionsRDD[736] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:33.663 INFO MemoryStore - Block broadcast_311 stored as values in memory (estimated size 426.1 KiB, free 1917.6 MiB)
18:46:33.664 INFO MemoryStore - Block broadcast_311_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
18:46:33.664 INFO BlockManagerInfo - Added broadcast_311_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:33.664 INFO SparkContext - Created broadcast 311 from broadcast at DAGScheduler.scala:1580
18:46:33.664 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 162 (MapPartitionsRDD[736] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:33.664 INFO TaskSchedulerImpl - Adding task set 162.0 with 1 tasks resource profile 0
18:46:33.665 INFO TaskSetManager - Starting task 0.0 in stage 162.0 (TID 218) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:33.665 INFO Executor - Running task 0.0 in stage 162.0 (TID 218)
18:46:33.694 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:33.707 INFO Executor - Finished task 0.0 in stage 162.0 (TID 218). 989 bytes result sent to driver
18:46:33.707 INFO TaskSetManager - Finished task 0.0 in stage 162.0 (TID 218) in 42 ms on localhost (executor driver) (1/1)
18:46:33.708 INFO TaskSchedulerImpl - Removed TaskSet 162.0, whose tasks have all completed, from pool
18:46:33.708 INFO DAGScheduler - ResultStage 162 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.062 s
18:46:33.708 INFO DAGScheduler - Job 117 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.708 INFO TaskSchedulerImpl - Killing all running tasks in stage 162: Stage finished
18:46:33.708 INFO DAGScheduler - Job 117 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.062775 s
18:46:33.712 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:33.712 INFO DAGScheduler - Got job 118 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:33.712 INFO DAGScheduler - Final stage: ResultStage 163 (count at ReadsSparkSinkUnitTest.java:185)
18:46:33.712 INFO DAGScheduler - Parents of final stage: List()
18:46:33.713 INFO DAGScheduler - Missing parents: List()
18:46:33.713 INFO DAGScheduler - Submitting ResultStage 163 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:33.714 INFO MemoryStore - Block broadcast_312 stored as values in memory (estimated size 7.4 KiB, free 1917.4 MiB)
18:46:33.715 INFO MemoryStore - Block broadcast_312_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1917.4 MiB)
18:46:33.715 INFO BlockManagerInfo - Added broadcast_312_piece0 in memory on localhost:45727 (size: 3.8 KiB, free: 1919.5 MiB)
18:46:33.715 INFO SparkContext - Created broadcast 312 from broadcast at DAGScheduler.scala:1580
18:46:33.715 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 163 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:33.715 INFO TaskSchedulerImpl - Adding task set 163.0 with 1 tasks resource profile 0
18:46:33.716 INFO TaskSetManager - Starting task 0.0 in stage 163.0 (TID 219) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:33.716 INFO Executor - Running task 0.0 in stage 163.0 (TID 219)
18:46:33.718 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest65787526726048799445.sam:0+847558
18:46:33.731 INFO Executor - Finished task 0.0 in stage 163.0 (TID 219). 989 bytes result sent to driver
18:46:33.731 INFO TaskSetManager - Finished task 0.0 in stage 163.0 (TID 219) in 15 ms on localhost (executor driver) (1/1)
18:46:33.731 INFO TaskSchedulerImpl - Removed TaskSet 163.0, whose tasks have all completed, from pool
18:46:33.732 INFO DAGScheduler - ResultStage 163 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.018 s
18:46:33.732 INFO DAGScheduler - Job 118 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.732 INFO TaskSchedulerImpl - Killing all running tasks in stage 163: Stage finished
18:46:33.732 INFO DAGScheduler - Job 118 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.019999 s
WARNING 2025-05-19 18:46:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-05-19 18:46:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:33.736 INFO MemoryStore - Block broadcast_313 stored as values in memory (estimated size 21.0 KiB, free 1917.4 MiB)
18:46:33.736 INFO MemoryStore - Block broadcast_313_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1917.4 MiB)
18:46:33.737 INFO BlockManagerInfo - Added broadcast_313_piece0 in memory on localhost:45727 (size: 2.4 KiB, free: 1919.5 MiB)
18:46:33.737 INFO SparkContext - Created broadcast 313 from broadcast at SamSource.java:78
18:46:33.738 INFO MemoryStore - Block broadcast_314 stored as values in memory (estimated size 298.0 KiB, free 1917.1 MiB)
18:46:33.746 INFO MemoryStore - Block broadcast_314_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.1 MiB)
18:46:33.746 INFO BlockManagerInfo - Added broadcast_314_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.4 MiB)
18:46:33.747 INFO SparkContext - Created broadcast 314 from newAPIHadoopFile at SamSource.java:108
18:46:33.751 INFO FileInputFormat - Total input files to process : 1
18:46:33.754 INFO SparkContext - Starting job: collect at SparkUtils.java:205
18:46:33.755 INFO DAGScheduler - Got job 119 (collect at SparkUtils.java:205) with 1 output partitions
18:46:33.755 INFO DAGScheduler - Final stage: ResultStage 164 (collect at SparkUtils.java:205)
18:46:33.755 INFO DAGScheduler - Parents of final stage: List()
18:46:33.755 INFO DAGScheduler - Missing parents: List()
18:46:33.755 INFO DAGScheduler - Submitting ResultStage 164 (MapPartitionsRDD[760] at mapPartitions at SparkUtils.java:188), which has no missing parents
18:46:33.755 INFO MemoryStore - Block broadcast_315 stored as values in memory (estimated size 7.9 KiB, free 1917.1 MiB)
18:46:33.756 INFO MemoryStore - Block broadcast_315_piece0 stored as bytes in memory (estimated size 3.9 KiB, free 1917.1 MiB)
18:46:33.756 INFO BlockManagerInfo - Added broadcast_315_piece0 in memory on localhost:45727 (size: 3.9 KiB, free: 1919.4 MiB)
18:46:33.756 INFO SparkContext - Created broadcast 315 from broadcast at DAGScheduler.scala:1580
18:46:33.756 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 164 (MapPartitionsRDD[760] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
18:46:33.756 INFO TaskSchedulerImpl - Adding task set 164.0 with 1 tasks resource profile 0
18:46:33.757 INFO TaskSetManager - Starting task 0.0 in stage 164.0 (TID 220) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7936 bytes)
18:46:33.757 INFO Executor - Running task 0.0 in stage 164.0 (TID 220)
18:46:33.758 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
18:46:33.760 INFO Executor - Finished task 0.0 in stage 164.0 (TID 220). 1657 bytes result sent to driver
18:46:33.761 INFO TaskSetManager - Finished task 0.0 in stage 164.0 (TID 220) in 5 ms on localhost (executor driver) (1/1)
18:46:33.761 INFO TaskSchedulerImpl - Removed TaskSet 164.0, whose tasks have all completed, from pool
18:46:33.761 INFO DAGScheduler - ResultStage 164 (collect at SparkUtils.java:205) finished in 0.006 s
18:46:33.761 INFO DAGScheduler - Job 119 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.761 INFO TaskSchedulerImpl - Killing all running tasks in stage 164: Stage finished
18:46:33.761 INFO DAGScheduler - Job 119 finished: collect at SparkUtils.java:205, took 0.006867 s
WARNING 2025-05-19 18:46:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-05-19 18:46:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:33.765 INFO MemoryStore - Block broadcast_316 stored as values in memory (estimated size 21.0 KiB, free 1917.0 MiB)
18:46:33.766 INFO MemoryStore - Block broadcast_316_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1917.0 MiB)
18:46:33.766 INFO BlockManagerInfo - Added broadcast_316_piece0 in memory on localhost:45727 (size: 2.4 KiB, free: 1919.4 MiB)
18:46:33.767 INFO SparkContext - Created broadcast 316 from broadcast at SamSource.java:78
18:46:33.768 INFO MemoryStore - Block broadcast_317 stored as values in memory (estimated size 298.0 KiB, free 1916.7 MiB)
18:46:33.774 INFO MemoryStore - Block broadcast_317_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.7 MiB)
18:46:33.774 INFO BlockManagerInfo - Added broadcast_317_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.4 MiB)
18:46:33.774 INFO SparkContext - Created broadcast 317 from newAPIHadoopFile at SamSource.java:108
18:46:33.776 INFO MemoryStore - Block broadcast_318 stored as values in memory (estimated size 21.0 KiB, free 1916.7 MiB)
18:46:33.776 INFO MemoryStore - Block broadcast_318_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1916.7 MiB)
18:46:33.776 INFO BlockManagerInfo - Added broadcast_318_piece0 in memory on localhost:45727 (size: 2.4 KiB, free: 1919.4 MiB)
18:46:33.777 INFO SparkContext - Created broadcast 318 from broadcast at ReadsSparkSink.java:133
18:46:33.777 INFO MemoryStore - Block broadcast_319 stored as values in memory (estimated size 21.5 KiB, free 1916.6 MiB)
18:46:33.782 INFO MemoryStore - Block broadcast_319_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1916.6 MiB)
18:46:33.782 INFO BlockManagerInfo - Added broadcast_319_piece0 in memory on localhost:45727 (size: 2.4 KiB, free: 1919.4 MiB)
18:46:33.782 INFO BlockManagerInfo - Removed broadcast_310_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1919.4 MiB)
18:46:33.783 INFO SparkContext - Created broadcast 319 from broadcast at BamSink.java:76
18:46:33.783 INFO BlockManagerInfo - Removed broadcast_316_piece0 on localhost:45727 in memory (size: 2.4 KiB, free: 1919.4 MiB)
18:46:33.784 INFO BlockManagerInfo - Removed broadcast_305_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.4 MiB)
18:46:33.785 INFO BlockManagerInfo - Removed broadcast_307_piece0 on localhost:45727 in memory (size: 66.9 KiB, free: 1919.5 MiB)
18:46:33.785 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:33.785 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:33.785 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:33.785 INFO BlockManagerInfo - Removed broadcast_317_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.5 MiB)
18:46:33.786 INFO BlockManagerInfo - Removed broadcast_308_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:33.786 INFO BlockManagerInfo - Removed broadcast_315_piece0 on localhost:45727 in memory (size: 3.9 KiB, free: 1919.5 MiB)
18:46:33.787 INFO BlockManagerInfo - Removed broadcast_303_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:33.787 INFO BlockManagerInfo - Removed broadcast_311_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:33.787 INFO BlockManagerInfo - Removed broadcast_309_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:33.788 INFO BlockManagerInfo - Removed broadcast_312_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1919.8 MiB)
18:46:33.788 INFO BlockManagerInfo - Removed broadcast_306_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.9 MiB)
18:46:33.803 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:33.803 INFO DAGScheduler - Got job 120 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:33.803 INFO DAGScheduler - Final stage: ResultStage 165 (runJob at SparkHadoopWriter.scala:83)
18:46:33.803 INFO DAGScheduler - Parents of final stage: List()
18:46:33.803 INFO DAGScheduler - Missing parents: List()
18:46:33.803 INFO DAGScheduler - Submitting ResultStage 165 (MapPartitionsRDD[770] at mapToPair at BamSink.java:91), which has no missing parents
18:46:33.810 INFO MemoryStore - Block broadcast_320 stored as values in memory (estimated size 152.3 KiB, free 1919.4 MiB)
18:46:33.811 INFO MemoryStore - Block broadcast_320_piece0 stored as bytes in memory (estimated size 56.4 KiB, free 1919.4 MiB)
18:46:33.811 INFO BlockManagerInfo - Added broadcast_320_piece0 in memory on localhost:45727 (size: 56.4 KiB, free: 1919.9 MiB)
18:46:33.811 INFO SparkContext - Created broadcast 320 from broadcast at DAGScheduler.scala:1580
18:46:33.812 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 165 (MapPartitionsRDD[770] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:33.812 INFO TaskSchedulerImpl - Adding task set 165.0 with 1 tasks resource profile 0
18:46:33.813 INFO TaskSetManager - Starting task 0.0 in stage 165.0 (TID 221) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8561 bytes)
18:46:33.813 INFO Executor - Running task 0.0 in stage 165.0 (TID 221)
18:46:33.819 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
18:46:33.822 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:33.822 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:33.822 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:33.822 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:33.822 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:33.822 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:33.851 INFO FileOutputCommitter - Saved output of task 'attempt_20250519184633267566346102067468_0770_r_000000_0' to file:/tmp/ReadsSparkSinkNotSorting3230199502878123492.bam.parts/_temporary/0/task_20250519184633267566346102067468_0770_r_000000
18:46:33.851 INFO SparkHadoopMapRedUtil - attempt_20250519184633267566346102067468_0770_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:33.852 INFO Executor - Finished task 0.0 in stage 165.0 (TID 221). 1084 bytes result sent to driver
18:46:33.852 INFO TaskSetManager - Finished task 0.0 in stage 165.0 (TID 221) in 40 ms on localhost (executor driver) (1/1)
18:46:33.852 INFO TaskSchedulerImpl - Removed TaskSet 165.0, whose tasks have all completed, from pool
18:46:33.852 INFO DAGScheduler - ResultStage 165 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
18:46:33.853 INFO DAGScheduler - Job 120 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.853 INFO TaskSchedulerImpl - Killing all running tasks in stage 165: Stage finished
18:46:33.853 INFO DAGScheduler - Job 120 finished: runJob at SparkHadoopWriter.scala:83, took 0.050269 s
18:46:33.853 INFO SparkHadoopWriter - Start to commit write Job job_20250519184633267566346102067468_0770.
18:46:33.859 INFO SparkHadoopWriter - Write Job job_20250519184633267566346102067468_0770 committed. Elapsed time: 5 ms.
18:46:33.870 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkNotSorting3230199502878123492.bam
18:46:33.874 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkNotSorting3230199502878123492.bam done
18:46:33.874 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkNotSorting3230199502878123492.bam.parts/ to /tmp/ReadsSparkSinkNotSorting3230199502878123492.bam.sbi
18:46:33.879 INFO IndexFileMerger - Done merging .sbi files
18:46:33.880 INFO MemoryStore - Block broadcast_321 stored as values in memory (estimated size 192.0 B, free 1919.4 MiB)
18:46:33.880 INFO MemoryStore - Block broadcast_321_piece0 stored as bytes in memory (estimated size 127.0 B, free 1919.4 MiB)
18:46:33.880 INFO BlockManagerInfo - Added broadcast_321_piece0 in memory on localhost:45727 (size: 127.0 B, free: 1919.9 MiB)
18:46:33.881 INFO SparkContext - Created broadcast 321 from broadcast at BamSource.java:104
18:46:33.882 INFO MemoryStore - Block broadcast_322 stored as values in memory (estimated size 297.9 KiB, free 1919.1 MiB)
18:46:33.888 INFO MemoryStore - Block broadcast_322_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.0 MiB)
18:46:33.888 INFO BlockManagerInfo - Added broadcast_322_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.8 MiB)
18:46:33.888 INFO SparkContext - Created broadcast 322 from newAPIHadoopFile at PathSplitSource.java:96
18:46:33.897 INFO FileInputFormat - Total input files to process : 1
18:46:33.911 INFO SparkContext - Starting job: collect at SparkUtils.java:205
18:46:33.911 INFO DAGScheduler - Got job 121 (collect at SparkUtils.java:205) with 1 output partitions
18:46:33.911 INFO DAGScheduler - Final stage: ResultStage 166 (collect at SparkUtils.java:205)
18:46:33.911 INFO DAGScheduler - Parents of final stage: List()
18:46:33.911 INFO DAGScheduler - Missing parents: List()
18:46:33.911 INFO DAGScheduler - Submitting ResultStage 166 (MapPartitionsRDD[777] at mapPartitions at SparkUtils.java:188), which has no missing parents
18:46:33.917 INFO MemoryStore - Block broadcast_323 stored as values in memory (estimated size 148.6 KiB, free 1918.9 MiB)
18:46:33.918 INFO MemoryStore - Block broadcast_323_piece0 stored as bytes in memory (estimated size 54.7 KiB, free 1918.8 MiB)
18:46:33.918 INFO BlockManagerInfo - Added broadcast_323_piece0 in memory on localhost:45727 (size: 54.7 KiB, free: 1919.8 MiB)
18:46:33.918 INFO SparkContext - Created broadcast 323 from broadcast at DAGScheduler.scala:1580
18:46:33.919 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 166 (MapPartitionsRDD[777] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
18:46:33.919 INFO TaskSchedulerImpl - Adding task set 166.0 with 1 tasks resource profile 0
18:46:33.919 INFO TaskSetManager - Starting task 0.0 in stage 166.0 (TID 222) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:33.919 INFO Executor - Running task 0.0 in stage 166.0 (TID 222)
18:46:33.931 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkNotSorting3230199502878123492.bam:0+59395
18:46:33.932 INFO Executor - Finished task 0.0 in stage 166.0 (TID 222). 1700 bytes result sent to driver
18:46:33.933 INFO TaskSetManager - Finished task 0.0 in stage 166.0 (TID 222) in 14 ms on localhost (executor driver) (1/1)
18:46:33.933 INFO TaskSchedulerImpl - Removed TaskSet 166.0, whose tasks have all completed, from pool
18:46:33.933 INFO DAGScheduler - ResultStage 166 (collect at SparkUtils.java:205) finished in 0.021 s
18:46:33.933 INFO DAGScheduler - Job 121 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.933 INFO TaskSchedulerImpl - Killing all running tasks in stage 166: Stage finished
18:46:33.933 INFO DAGScheduler - Job 121 finished: collect at SparkUtils.java:205, took 0.022173 s
18:46:33.954 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:91
18:46:33.954 INFO DAGScheduler - Got job 122 (collect at ReadsSparkSinkUnitTest.java:91) with 1 output partitions
18:46:33.954 INFO DAGScheduler - Final stage: ResultStage 167 (collect at ReadsSparkSinkUnitTest.java:91)
18:46:33.954 INFO DAGScheduler - Parents of final stage: List()
18:46:33.954 INFO DAGScheduler - Missing parents: List()
18:46:33.954 INFO DAGScheduler - Submitting ResultStage 167 (ZippedPartitionsRDD2[780] at zipPartitions at SparkUtils.java:244), which has no missing parents
18:46:33.964 INFO MemoryStore - Block broadcast_324 stored as values in memory (estimated size 149.8 KiB, free 1918.7 MiB)
18:46:33.965 INFO MemoryStore - Block broadcast_324_piece0 stored as bytes in memory (estimated size 55.2 KiB, free 1918.6 MiB)
18:46:33.965 INFO BlockManagerInfo - Added broadcast_324_piece0 in memory on localhost:45727 (size: 55.2 KiB, free: 1919.7 MiB)
18:46:33.965 INFO SparkContext - Created broadcast 324 from broadcast at DAGScheduler.scala:1580
18:46:33.966 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 167 (ZippedPartitionsRDD2[780] at zipPartitions at SparkUtils.java:244) (first 15 tasks are for partitions Vector(0))
18:46:33.966 INFO TaskSchedulerImpl - Adding task set 167.0 with 1 tasks resource profile 0
18:46:33.966 INFO TaskSetManager - Starting task 0.0 in stage 167.0 (TID 223) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8435 bytes)
18:46:33.966 INFO Executor - Running task 0.0 in stage 167.0 (TID 223)
18:46:33.978 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkNotSorting3230199502878123492.bam:0+59395
18:46:33.980 INFO Executor - Finished task 0.0 in stage 167.0 (TID 223). 192451 bytes result sent to driver
18:46:33.981 INFO TaskSetManager - Finished task 0.0 in stage 167.0 (TID 223) in 15 ms on localhost (executor driver) (1/1)
18:46:33.981 INFO TaskSchedulerImpl - Removed TaskSet 167.0, whose tasks have all completed, from pool
18:46:33.981 INFO DAGScheduler - ResultStage 167 (collect at ReadsSparkSinkUnitTest.java:91) finished in 0.027 s
18:46:33.981 INFO DAGScheduler - Job 122 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:33.981 INFO TaskSchedulerImpl - Killing all running tasks in stage 167: Stage finished
18:46:33.981 INFO DAGScheduler - Job 122 finished: collect at ReadsSparkSinkUnitTest.java:91, took 0.027222 s
WARNING 2025-05-19 18:46:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-05-19 18:46:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:33.983 INFO MemoryStore - Block broadcast_325 stored as values in memory (estimated size 21.0 KiB, free 1918.6 MiB)
18:46:33.983 INFO MemoryStore - Block broadcast_325_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1918.6 MiB)
18:46:33.983 INFO BlockManagerInfo - Added broadcast_325_piece0 in memory on localhost:45727 (size: 2.4 KiB, free: 1919.7 MiB)
18:46:33.984 INFO SparkContext - Created broadcast 325 from broadcast at SamSource.java:78
18:46:33.985 INFO MemoryStore - Block broadcast_326 stored as values in memory (estimated size 298.0 KiB, free 1918.3 MiB)
18:46:33.995 INFO MemoryStore - Block broadcast_326_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.3 MiB)
18:46:33.995 INFO BlockManagerInfo - Added broadcast_326_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.7 MiB)
18:46:33.995 INFO SparkContext - Created broadcast 326 from newAPIHadoopFile at SamSource.java:108
18:46:33.999 INFO FileInputFormat - Total input files to process : 1
18:46:34.004 INFO SparkContext - Starting job: collect at SparkUtils.java:205
18:46:34.005 INFO DAGScheduler - Got job 123 (collect at SparkUtils.java:205) with 1 output partitions
18:46:34.005 INFO DAGScheduler - Final stage: ResultStage 168 (collect at SparkUtils.java:205)
18:46:34.005 INFO DAGScheduler - Parents of final stage: List()
18:46:34.005 INFO DAGScheduler - Missing parents: List()
18:46:34.005 INFO DAGScheduler - Submitting ResultStage 168 (MapPartitionsRDD[786] at mapPartitions at SparkUtils.java:188), which has no missing parents
18:46:34.005 INFO MemoryStore - Block broadcast_327 stored as values in memory (estimated size 7.9 KiB, free 1918.3 MiB)
18:46:34.006 INFO MemoryStore - Block broadcast_327_piece0 stored as bytes in memory (estimated size 3.9 KiB, free 1918.3 MiB)
18:46:34.006 INFO BlockManagerInfo - Added broadcast_327_piece0 in memory on localhost:45727 (size: 3.9 KiB, free: 1919.7 MiB)
18:46:34.006 INFO SparkContext - Created broadcast 327 from broadcast at DAGScheduler.scala:1580
18:46:34.006 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 168 (MapPartitionsRDD[786] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
18:46:34.006 INFO TaskSchedulerImpl - Adding task set 168.0 with 1 tasks resource profile 0
18:46:34.007 INFO TaskSetManager - Starting task 0.0 in stage 168.0 (TID 224) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7936 bytes)
18:46:34.007 INFO Executor - Running task 0.0 in stage 168.0 (TID 224)
18:46:34.008 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
18:46:34.010 INFO Executor - Finished task 0.0 in stage 168.0 (TID 224). 1657 bytes result sent to driver
18:46:34.010 INFO TaskSetManager - Finished task 0.0 in stage 168.0 (TID 224) in 3 ms on localhost (executor driver) (1/1)
18:46:34.010 INFO TaskSchedulerImpl - Removed TaskSet 168.0, whose tasks have all completed, from pool
18:46:34.010 INFO DAGScheduler - ResultStage 168 (collect at SparkUtils.java:205) finished in 0.005 s
18:46:34.010 INFO DAGScheduler - Job 123 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.010 INFO TaskSchedulerImpl - Killing all running tasks in stage 168: Stage finished
18:46:34.010 INFO DAGScheduler - Job 123 finished: collect at SparkUtils.java:205, took 0.005955 s
18:46:34.017 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:94
18:46:34.017 INFO DAGScheduler - Got job 124 (collect at ReadsSparkSinkUnitTest.java:94) with 1 output partitions
18:46:34.018 INFO DAGScheduler - Final stage: ResultStage 169 (collect at ReadsSparkSinkUnitTest.java:94)
18:46:34.018 INFO DAGScheduler - Parents of final stage: List()
18:46:34.018 INFO DAGScheduler - Missing parents: List()
18:46:34.018 INFO DAGScheduler - Submitting ResultStage 169 (ZippedPartitionsRDD2[789] at zipPartitions at SparkUtils.java:244), which has no missing parents
18:46:34.018 INFO MemoryStore - Block broadcast_328 stored as values in memory (estimated size 9.6 KiB, free 1918.3 MiB)
18:46:34.019 INFO MemoryStore - Block broadcast_328_piece0 stored as bytes in memory (estimated size 4.4 KiB, free 1918.3 MiB)
18:46:34.019 INFO BlockManagerInfo - Added broadcast_328_piece0 in memory on localhost:45727 (size: 4.4 KiB, free: 1919.7 MiB)
18:46:34.019 INFO SparkContext - Created broadcast 328 from broadcast at DAGScheduler.scala:1580
18:46:34.019 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 169 (ZippedPartitionsRDD2[789] at zipPartitions at SparkUtils.java:244) (first 15 tasks are for partitions Vector(0))
18:46:34.019 INFO TaskSchedulerImpl - Adding task set 169.0 with 1 tasks resource profile 0
18:46:34.020 INFO TaskSetManager - Starting task 0.0 in stage 169.0 (TID 225) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8561 bytes)
18:46:34.020 INFO Executor - Running task 0.0 in stage 169.0 (TID 225)
18:46:34.021 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
18:46:34.030 INFO Executor - Finished task 0.0 in stage 169.0 (TID 225). 192494 bytes result sent to driver
18:46:34.031 INFO TaskSetManager - Finished task 0.0 in stage 169.0 (TID 225) in 12 ms on localhost (executor driver) (1/1)
18:46:34.031 INFO TaskSchedulerImpl - Removed TaskSet 169.0, whose tasks have all completed, from pool
18:46:34.031 INFO DAGScheduler - ResultStage 169 (collect at ReadsSparkSinkUnitTest.java:94) finished in 0.013 s
18:46:34.031 INFO DAGScheduler - Job 124 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.031 INFO TaskSchedulerImpl - Killing all running tasks in stage 169: Stage finished
18:46:34.031 INFO DAGScheduler - Job 124 finished: collect at ReadsSparkSinkUnitTest.java:94, took 0.014096 s
18:46:34.040 INFO MemoryStore - Block broadcast_329 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
18:46:34.047 INFO MemoryStore - Block broadcast_329_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
18:46:34.047 INFO BlockManagerInfo - Added broadcast_329_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:34.047 INFO SparkContext - Created broadcast 329 from newAPIHadoopFile at PathSplitSource.java:96
18:46:34.069 INFO MemoryStore - Block broadcast_330 stored as values in memory (estimated size 297.9 KiB, free 1917.6 MiB)
18:46:34.075 INFO MemoryStore - Block broadcast_330_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.6 MiB)
18:46:34.075 INFO BlockManagerInfo - Added broadcast_330_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:34.075 INFO SparkContext - Created broadcast 330 from newAPIHadoopFile at PathSplitSource.java:96
18:46:34.095 INFO FileInputFormat - Total input files to process : 1
18:46:34.097 INFO MemoryStore - Block broadcast_331 stored as values in memory (estimated size 160.7 KiB, free 1917.4 MiB)
18:46:34.097 INFO MemoryStore - Block broadcast_331_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.4 MiB)
18:46:34.098 INFO BlockManagerInfo - Added broadcast_331_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:34.098 INFO SparkContext - Created broadcast 331 from broadcast at ReadsSparkSink.java:133
18:46:34.099 INFO MemoryStore - Block broadcast_332 stored as values in memory (estimated size 163.2 KiB, free 1917.3 MiB)
18:46:34.104 INFO BlockManagerInfo - Removed broadcast_324_piece0 on localhost:45727 in memory (size: 55.2 KiB, free: 1919.6 MiB)
18:46:34.104 INFO MemoryStore - Block broadcast_332_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.4 MiB)
18:46:34.104 INFO BlockManagerInfo - Added broadcast_332_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:34.104 INFO SparkContext - Created broadcast 332 from broadcast at BamSink.java:76
18:46:34.104 INFO BlockManagerInfo - Removed broadcast_320_piece0 on localhost:45727 in memory (size: 56.4 KiB, free: 1919.7 MiB)
18:46:34.105 INFO BlockManagerInfo - Removed broadcast_328_piece0 on localhost:45727 in memory (size: 4.4 KiB, free: 1919.7 MiB)
18:46:34.105 INFO BlockManagerInfo - Removed broadcast_321_piece0 on localhost:45727 in memory (size: 127.0 B, free: 1919.7 MiB)
18:46:34.106 INFO BlockManagerInfo - Removed broadcast_330_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:34.106 INFO BlockManagerInfo - Removed broadcast_319_piece0 on localhost:45727 in memory (size: 2.4 KiB, free: 1919.7 MiB)
18:46:34.106 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:34.106 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:34.106 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:34.107 INFO BlockManagerInfo - Removed broadcast_322_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:34.107 INFO BlockManagerInfo - Removed broadcast_318_piece0 on localhost:45727 in memory (size: 2.4 KiB, free: 1919.8 MiB)
18:46:34.108 INFO BlockManagerInfo - Removed broadcast_327_piece0 on localhost:45727 in memory (size: 3.9 KiB, free: 1919.8 MiB)
18:46:34.109 INFO BlockManagerInfo - Removed broadcast_314_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.8 MiB)
18:46:34.109 INFO BlockManagerInfo - Removed broadcast_323_piece0 on localhost:45727 in memory (size: 54.7 KiB, free: 1919.9 MiB)
18:46:34.110 INFO BlockManagerInfo - Removed broadcast_325_piece0 on localhost:45727 in memory (size: 2.4 KiB, free: 1919.9 MiB)
18:46:34.110 INFO BlockManagerInfo - Removed broadcast_313_piece0 on localhost:45727 in memory (size: 2.4 KiB, free: 1919.9 MiB)
18:46:34.111 INFO BlockManagerInfo - Removed broadcast_326_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.9 MiB)
18:46:34.125 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:34.126 INFO DAGScheduler - Registering RDD 803 (mapToPair at SparkUtils.java:161) as input to shuffle 34
18:46:34.126 INFO DAGScheduler - Got job 125 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:34.126 INFO DAGScheduler - Final stage: ResultStage 171 (runJob at SparkHadoopWriter.scala:83)
18:46:34.126 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 170)
18:46:34.126 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 170)
18:46:34.126 INFO DAGScheduler - Submitting ShuffleMapStage 170 (MapPartitionsRDD[803] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:34.149 INFO MemoryStore - Block broadcast_333 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
18:46:34.151 INFO MemoryStore - Block broadcast_333_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
18:46:34.151 INFO BlockManagerInfo - Added broadcast_333_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.8 MiB)
18:46:34.151 INFO SparkContext - Created broadcast 333 from broadcast at DAGScheduler.scala:1580
18:46:34.151 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 170 (MapPartitionsRDD[803] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:34.151 INFO TaskSchedulerImpl - Adding task set 170.0 with 1 tasks resource profile 0
18:46:34.152 INFO TaskSetManager - Starting task 0.0 in stage 170.0 (TID 226) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:34.152 INFO Executor - Running task 0.0 in stage 170.0 (TID 226)
18:46:34.182 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:34.197 INFO Executor - Finished task 0.0 in stage 170.0 (TID 226). 1148 bytes result sent to driver
18:46:34.198 INFO TaskSetManager - Finished task 0.0 in stage 170.0 (TID 226) in 46 ms on localhost (executor driver) (1/1)
18:46:34.198 INFO TaskSchedulerImpl - Removed TaskSet 170.0, whose tasks have all completed, from pool
18:46:34.198 INFO DAGScheduler - ShuffleMapStage 170 (mapToPair at SparkUtils.java:161) finished in 0.072 s
18:46:34.198 INFO DAGScheduler - looking for newly runnable stages
18:46:34.198 INFO DAGScheduler - running: HashSet()
18:46:34.198 INFO DAGScheduler - waiting: HashSet(ResultStage 171)
18:46:34.198 INFO DAGScheduler - failed: HashSet()
18:46:34.198 INFO DAGScheduler - Submitting ResultStage 171 (MapPartitionsRDD[808] at mapToPair at BamSink.java:91), which has no missing parents
18:46:34.205 INFO MemoryStore - Block broadcast_334 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
18:46:34.206 INFO MemoryStore - Block broadcast_334_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
18:46:34.206 INFO BlockManagerInfo - Added broadcast_334_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.7 MiB)
18:46:34.206 INFO SparkContext - Created broadcast 334 from broadcast at DAGScheduler.scala:1580
18:46:34.206 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 171 (MapPartitionsRDD[808] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:34.206 INFO TaskSchedulerImpl - Adding task set 171.0 with 1 tasks resource profile 0
18:46:34.206 INFO TaskSetManager - Starting task 0.0 in stage 171.0 (TID 227) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:34.207 INFO Executor - Running task 0.0 in stage 171.0 (TID 227)
18:46:34.211 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:34.211 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:34.223 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:34.223 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:34.223 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:34.223 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:34.223 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:34.223 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:34.252 INFO FileOutputCommitter - Saved output of task 'attempt_20250519184634920272910726285255_0808_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace14697208361303121310/_temporary/0/task_20250519184634920272910726285255_0808_r_000000
18:46:34.252 INFO SparkHadoopMapRedUtil - attempt_20250519184634920272910726285255_0808_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:34.252 INFO Executor - Finished task 0.0 in stage 171.0 (TID 227). 1858 bytes result sent to driver
18:46:34.253 INFO TaskSetManager - Finished task 0.0 in stage 171.0 (TID 227) in 47 ms on localhost (executor driver) (1/1)
18:46:34.253 INFO TaskSchedulerImpl - Removed TaskSet 171.0, whose tasks have all completed, from pool
18:46:34.253 INFO DAGScheduler - ResultStage 171 (runJob at SparkHadoopWriter.scala:83) finished in 0.055 s
18:46:34.253 INFO DAGScheduler - Job 125 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.253 INFO TaskSchedulerImpl - Killing all running tasks in stage 171: Stage finished
18:46:34.253 INFO DAGScheduler - Job 125 finished: runJob at SparkHadoopWriter.scala:83, took 0.127702 s
18:46:34.253 INFO SparkHadoopWriter - Start to commit write Job job_20250519184634920272910726285255_0808.
18:46:34.260 INFO SparkHadoopWriter - Write Job job_20250519184634920272910726285255_0808 committed. Elapsed time: 6 ms.
18:46:34.271 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest117317550688400247213.bam
18:46:34.276 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest117317550688400247213.bam done
18:46:34.276 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace14697208361303121310 to /tmp/ReadsSparkSinkUnitTest117317550688400247213.bam.sbi
18:46:34.282 INFO IndexFileMerger - Done merging .sbi files
18:46:34.282 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace14697208361303121310 to /tmp/ReadsSparkSinkUnitTest117317550688400247213.bam.bai
18:46:34.287 INFO IndexFileMerger - Done merging .bai files
18:46:34.290 INFO MemoryStore - Block broadcast_335 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
18:46:34.290 INFO MemoryStore - Block broadcast_335_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
18:46:34.290 INFO BlockManagerInfo - Added broadcast_335_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.7 MiB)
18:46:34.291 INFO SparkContext - Created broadcast 335 from broadcast at BamSource.java:104
18:46:34.292 INFO MemoryStore - Block broadcast_336 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:34.302 INFO MemoryStore - Block broadcast_336_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:34.302 INFO BlockManagerInfo - Added broadcast_336_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:34.302 INFO SparkContext - Created broadcast 336 from newAPIHadoopFile at PathSplitSource.java:96
18:46:34.316 INFO FileInputFormat - Total input files to process : 1
18:46:34.340 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:34.340 INFO DAGScheduler - Got job 126 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:34.340 INFO DAGScheduler - Final stage: ResultStage 172 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:34.340 INFO DAGScheduler - Parents of final stage: List()
18:46:34.340 INFO DAGScheduler - Missing parents: List()
18:46:34.340 INFO DAGScheduler - Submitting ResultStage 172 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:34.350 INFO MemoryStore - Block broadcast_337 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
18:46:34.350 INFO MemoryStore - Block broadcast_337_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
18:46:34.351 INFO BlockManagerInfo - Added broadcast_337_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:34.351 INFO SparkContext - Created broadcast 337 from broadcast at DAGScheduler.scala:1580
18:46:34.351 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 172 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:34.351 INFO TaskSchedulerImpl - Adding task set 172.0 with 1 tasks resource profile 0
18:46:34.352 INFO TaskSetManager - Starting task 0.0 in stage 172.0 (TID 228) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:34.352 INFO Executor - Running task 0.0 in stage 172.0 (TID 228)
18:46:34.369 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117317550688400247213.bam:0+237038
18:46:34.375 INFO Executor - Finished task 0.0 in stage 172.0 (TID 228). 651526 bytes result sent to driver
18:46:34.377 INFO TaskSetManager - Finished task 0.0 in stage 172.0 (TID 228) in 26 ms on localhost (executor driver) (1/1)
18:46:34.377 INFO TaskSchedulerImpl - Removed TaskSet 172.0, whose tasks have all completed, from pool
18:46:34.378 INFO DAGScheduler - ResultStage 172 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.038 s
18:46:34.378 INFO DAGScheduler - Job 126 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.378 INFO TaskSchedulerImpl - Killing all running tasks in stage 172: Stage finished
18:46:34.378 INFO DAGScheduler - Job 126 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.038165 s
18:46:34.394 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:34.394 INFO DAGScheduler - Got job 127 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:34.394 INFO DAGScheduler - Final stage: ResultStage 173 (count at ReadsSparkSinkUnitTest.java:185)
18:46:34.394 INFO DAGScheduler - Parents of final stage: List()
18:46:34.394 INFO DAGScheduler - Missing parents: List()
18:46:34.394 INFO DAGScheduler - Submitting ResultStage 173 (MapPartitionsRDD[796] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:34.411 INFO MemoryStore - Block broadcast_338 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
18:46:34.413 INFO MemoryStore - Block broadcast_338_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
18:46:34.413 INFO BlockManagerInfo - Added broadcast_338_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:34.413 INFO SparkContext - Created broadcast 338 from broadcast at DAGScheduler.scala:1580
18:46:34.413 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 173 (MapPartitionsRDD[796] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:34.413 INFO TaskSchedulerImpl - Adding task set 173.0 with 1 tasks resource profile 0
18:46:34.414 INFO TaskSetManager - Starting task 0.0 in stage 173.0 (TID 229) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:34.414 INFO Executor - Running task 0.0 in stage 173.0 (TID 229)
18:46:34.442 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:34.453 INFO Executor - Finished task 0.0 in stage 173.0 (TID 229). 989 bytes result sent to driver
18:46:34.453 INFO TaskSetManager - Finished task 0.0 in stage 173.0 (TID 229) in 39 ms on localhost (executor driver) (1/1)
18:46:34.453 INFO TaskSchedulerImpl - Removed TaskSet 173.0, whose tasks have all completed, from pool
18:46:34.453 INFO DAGScheduler - ResultStage 173 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
18:46:34.453 INFO DAGScheduler - Job 127 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.453 INFO TaskSchedulerImpl - Killing all running tasks in stage 173: Stage finished
18:46:34.453 INFO DAGScheduler - Job 127 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059736 s
18:46:34.457 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:34.457 INFO DAGScheduler - Got job 128 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:34.457 INFO DAGScheduler - Final stage: ResultStage 174 (count at ReadsSparkSinkUnitTest.java:185)
18:46:34.457 INFO DAGScheduler - Parents of final stage: List()
18:46:34.457 INFO DAGScheduler - Missing parents: List()
18:46:34.457 INFO DAGScheduler - Submitting ResultStage 174 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:34.463 INFO MemoryStore - Block broadcast_339 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
18:46:34.464 INFO MemoryStore - Block broadcast_339_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
18:46:34.464 INFO BlockManagerInfo - Added broadcast_339_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.4 MiB)
18:46:34.464 INFO SparkContext - Created broadcast 339 from broadcast at DAGScheduler.scala:1580
18:46:34.464 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 174 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:34.465 INFO TaskSchedulerImpl - Adding task set 174.0 with 1 tasks resource profile 0
18:46:34.465 INFO TaskSetManager - Starting task 0.0 in stage 174.0 (TID 230) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:34.465 INFO Executor - Running task 0.0 in stage 174.0 (TID 230)
18:46:34.476 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117317550688400247213.bam:0+237038
18:46:34.479 INFO Executor - Finished task 0.0 in stage 174.0 (TID 230). 989 bytes result sent to driver
18:46:34.479 INFO TaskSetManager - Finished task 0.0 in stage 174.0 (TID 230) in 14 ms on localhost (executor driver) (1/1)
18:46:34.479 INFO TaskSchedulerImpl - Removed TaskSet 174.0, whose tasks have all completed, from pool
18:46:34.480 INFO DAGScheduler - ResultStage 174 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.023 s
18:46:34.480 INFO DAGScheduler - Job 128 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.480 INFO TaskSchedulerImpl - Killing all running tasks in stage 174: Stage finished
18:46:34.480 INFO DAGScheduler - Job 128 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022886 s
18:46:34.487 INFO MemoryStore - Block broadcast_340 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
18:46:34.494 INFO MemoryStore - Block broadcast_340_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
18:46:34.494 INFO BlockManagerInfo - Added broadcast_340_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:34.494 INFO SparkContext - Created broadcast 340 from newAPIHadoopFile at PathSplitSource.java:96
18:46:34.515 INFO MemoryStore - Block broadcast_341 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
18:46:34.521 INFO MemoryStore - Block broadcast_341_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:34.521 INFO BlockManagerInfo - Added broadcast_341_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:34.521 INFO SparkContext - Created broadcast 341 from newAPIHadoopFile at PathSplitSource.java:96
18:46:34.540 INFO FileInputFormat - Total input files to process : 1
18:46:34.542 INFO MemoryStore - Block broadcast_342 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
18:46:34.543 INFO MemoryStore - Block broadcast_342_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
18:46:34.543 INFO BlockManagerInfo - Added broadcast_342_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:34.543 INFO SparkContext - Created broadcast 342 from broadcast at ReadsSparkSink.java:133
18:46:34.544 INFO MemoryStore - Block broadcast_343 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
18:46:34.545 INFO MemoryStore - Block broadcast_343_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
18:46:34.545 INFO BlockManagerInfo - Added broadcast_343_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:34.545 INFO SparkContext - Created broadcast 343 from broadcast at BamSink.java:76
18:46:34.547 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:34.547 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:34.547 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:34.563 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:34.563 INFO DAGScheduler - Registering RDD 828 (mapToPair at SparkUtils.java:161) as input to shuffle 35
18:46:34.564 INFO DAGScheduler - Got job 129 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:34.564 INFO DAGScheduler - Final stage: ResultStage 176 (runJob at SparkHadoopWriter.scala:83)
18:46:34.564 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 175)
18:46:34.564 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 175)
18:46:34.564 INFO DAGScheduler - Submitting ShuffleMapStage 175 (MapPartitionsRDD[828] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:34.581 INFO MemoryStore - Block broadcast_344 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
18:46:34.582 INFO MemoryStore - Block broadcast_344_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
18:46:34.582 INFO BlockManagerInfo - Added broadcast_344_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.1 MiB)
18:46:34.583 INFO SparkContext - Created broadcast 344 from broadcast at DAGScheduler.scala:1580
18:46:34.583 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 175 (MapPartitionsRDD[828] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:34.583 INFO TaskSchedulerImpl - Adding task set 175.0 with 1 tasks resource profile 0
18:46:34.583 INFO TaskSetManager - Starting task 0.0 in stage 175.0 (TID 231) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:34.583 INFO Executor - Running task 0.0 in stage 175.0 (TID 231)
18:46:34.614 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:34.630 INFO Executor - Finished task 0.0 in stage 175.0 (TID 231). 1148 bytes result sent to driver
18:46:34.631 INFO TaskSetManager - Finished task 0.0 in stage 175.0 (TID 231) in 48 ms on localhost (executor driver) (1/1)
18:46:34.631 INFO TaskSchedulerImpl - Removed TaskSet 175.0, whose tasks have all completed, from pool
18:46:34.631 INFO DAGScheduler - ShuffleMapStage 175 (mapToPair at SparkUtils.java:161) finished in 0.067 s
18:46:34.631 INFO DAGScheduler - looking for newly runnable stages
18:46:34.631 INFO DAGScheduler - running: HashSet()
18:46:34.631 INFO DAGScheduler - waiting: HashSet(ResultStage 176)
18:46:34.631 INFO DAGScheduler - failed: HashSet()
18:46:34.631 INFO DAGScheduler - Submitting ResultStage 176 (MapPartitionsRDD[833] at mapToPair at BamSink.java:91), which has no missing parents
18:46:34.638 INFO MemoryStore - Block broadcast_345 stored as values in memory (estimated size 241.4 KiB, free 1915.1 MiB)
18:46:34.643 INFO MemoryStore - Block broadcast_345_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.1 MiB)
18:46:34.643 INFO BlockManagerInfo - Added broadcast_345_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.1 MiB)
18:46:34.643 INFO BlockManagerInfo - Removed broadcast_338_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.2 MiB)
18:46:34.643 INFO SparkContext - Created broadcast 345 from broadcast at DAGScheduler.scala:1580
18:46:34.643 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 176 (MapPartitionsRDD[833] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:34.643 INFO TaskSchedulerImpl - Adding task set 176.0 with 1 tasks resource profile 0
18:46:34.643 INFO BlockManagerInfo - Removed broadcast_331_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.2 MiB)
18:46:34.644 INFO BlockManagerInfo - Removed broadcast_341_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:34.644 INFO TaskSetManager - Starting task 0.0 in stage 176.0 (TID 232) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:34.644 INFO BlockManagerInfo - Removed broadcast_333_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.4 MiB)
18:46:34.645 INFO Executor - Running task 0.0 in stage 176.0 (TID 232)
18:46:34.646 INFO BlockManagerInfo - Removed broadcast_337_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.5 MiB)
18:46:34.647 INFO BlockManagerInfo - Removed broadcast_335_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.5 MiB)
18:46:34.647 INFO BlockManagerInfo - Removed broadcast_336_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:34.648 INFO BlockManagerInfo - Removed broadcast_334_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.6 MiB)
18:46:34.649 INFO BlockManagerInfo - Removed broadcast_332_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:34.649 INFO BlockManagerInfo - Removed broadcast_339_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.7 MiB)
18:46:34.650 INFO BlockManagerInfo - Removed broadcast_329_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:34.651 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:34.651 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:34.666 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:34.666 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:34.666 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:34.666 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:34.666 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:34.666 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:34.690 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846341824668441490853773_0833_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace10038445146793140847/_temporary/0/task_202505191846341824668441490853773_0833_r_000000
18:46:34.690 INFO SparkHadoopMapRedUtil - attempt_202505191846341824668441490853773_0833_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:34.691 INFO Executor - Finished task 0.0 in stage 176.0 (TID 232). 1858 bytes result sent to driver
18:46:34.692 INFO TaskSetManager - Finished task 0.0 in stage 176.0 (TID 232) in 48 ms on localhost (executor driver) (1/1)
18:46:34.692 INFO TaskSchedulerImpl - Removed TaskSet 176.0, whose tasks have all completed, from pool
18:46:34.692 INFO DAGScheduler - ResultStage 176 (runJob at SparkHadoopWriter.scala:83) finished in 0.061 s
18:46:34.692 INFO DAGScheduler - Job 129 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.692 INFO TaskSchedulerImpl - Killing all running tasks in stage 176: Stage finished
18:46:34.692 INFO DAGScheduler - Job 129 finished: runJob at SparkHadoopWriter.scala:83, took 0.129200 s
18:46:34.693 INFO SparkHadoopWriter - Start to commit write Job job_202505191846341824668441490853773_0833.
18:46:34.698 INFO SparkHadoopWriter - Write Job job_202505191846341824668441490853773_0833 committed. Elapsed time: 4 ms.
18:46:34.709 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest17804319436065731535.bam
18:46:34.713 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest17804319436065731535.bam done
18:46:34.713 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace10038445146793140847 to /tmp/ReadsSparkSinkUnitTest17804319436065731535.bam.sbi
18:46:34.718 INFO IndexFileMerger - Done merging .sbi files
18:46:34.718 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace10038445146793140847 to /tmp/ReadsSparkSinkUnitTest17804319436065731535.bam.bai
18:46:34.723 INFO IndexFileMerger - Done merging .bai files
18:46:34.725 INFO MemoryStore - Block broadcast_346 stored as values in memory (estimated size 13.3 KiB, free 1918.3 MiB)
18:46:34.725 INFO MemoryStore - Block broadcast_346_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.3 MiB)
18:46:34.726 INFO BlockManagerInfo - Added broadcast_346_piece0 in memory on localhost:45727 (size: 8.3 KiB, free: 1919.7 MiB)
18:46:34.726 INFO SparkContext - Created broadcast 346 from broadcast at BamSource.java:104
18:46:34.727 INFO MemoryStore - Block broadcast_347 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
18:46:34.733 INFO MemoryStore - Block broadcast_347_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:34.733 INFO BlockManagerInfo - Added broadcast_347_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:34.733 INFO SparkContext - Created broadcast 347 from newAPIHadoopFile at PathSplitSource.java:96
18:46:34.742 INFO FileInputFormat - Total input files to process : 1
18:46:34.756 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:34.756 INFO DAGScheduler - Got job 130 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:34.756 INFO DAGScheduler - Final stage: ResultStage 177 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:34.756 INFO DAGScheduler - Parents of final stage: List()
18:46:34.756 INFO DAGScheduler - Missing parents: List()
18:46:34.756 INFO DAGScheduler - Submitting ResultStage 177 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:34.762 INFO MemoryStore - Block broadcast_348 stored as values in memory (estimated size 148.2 KiB, free 1917.8 MiB)
18:46:34.763 INFO MemoryStore - Block broadcast_348_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
18:46:34.763 INFO BlockManagerInfo - Added broadcast_348_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:34.763 INFO SparkContext - Created broadcast 348 from broadcast at DAGScheduler.scala:1580
18:46:34.763 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 177 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:34.763 INFO TaskSchedulerImpl - Adding task set 177.0 with 1 tasks resource profile 0
18:46:34.764 INFO TaskSetManager - Starting task 0.0 in stage 177.0 (TID 233) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:34.764 INFO Executor - Running task 0.0 in stage 177.0 (TID 233)
18:46:34.776 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17804319436065731535.bam:0+237038
18:46:34.780 INFO Executor - Finished task 0.0 in stage 177.0 (TID 233). 651483 bytes result sent to driver
18:46:34.782 INFO TaskSetManager - Finished task 0.0 in stage 177.0 (TID 233) in 18 ms on localhost (executor driver) (1/1)
18:46:34.782 INFO TaskSchedulerImpl - Removed TaskSet 177.0, whose tasks have all completed, from pool
18:46:34.783 INFO DAGScheduler - ResultStage 177 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
18:46:34.783 INFO DAGScheduler - Job 130 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.783 INFO TaskSchedulerImpl - Killing all running tasks in stage 177: Stage finished
18:46:34.783 INFO DAGScheduler - Job 130 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026940 s
18:46:34.792 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:34.792 INFO DAGScheduler - Got job 131 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:34.792 INFO DAGScheduler - Final stage: ResultStage 178 (count at ReadsSparkSinkUnitTest.java:185)
18:46:34.792 INFO DAGScheduler - Parents of final stage: List()
18:46:34.792 INFO DAGScheduler - Missing parents: List()
18:46:34.792 INFO DAGScheduler - Submitting ResultStage 178 (MapPartitionsRDD[821] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:34.809 INFO MemoryStore - Block broadcast_349 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
18:46:34.811 INFO MemoryStore - Block broadcast_349_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
18:46:34.811 INFO BlockManagerInfo - Added broadcast_349_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.4 MiB)
18:46:34.811 INFO SparkContext - Created broadcast 349 from broadcast at DAGScheduler.scala:1580
18:46:34.811 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 178 (MapPartitionsRDD[821] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:34.811 INFO TaskSchedulerImpl - Adding task set 178.0 with 1 tasks resource profile 0
18:46:34.812 INFO TaskSetManager - Starting task 0.0 in stage 178.0 (TID 234) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:34.812 INFO Executor - Running task 0.0 in stage 178.0 (TID 234)
18:46:34.840 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:34.849 INFO Executor - Finished task 0.0 in stage 178.0 (TID 234). 989 bytes result sent to driver
18:46:34.850 INFO TaskSetManager - Finished task 0.0 in stage 178.0 (TID 234) in 39 ms on localhost (executor driver) (1/1)
18:46:34.850 INFO TaskSchedulerImpl - Removed TaskSet 178.0, whose tasks have all completed, from pool
18:46:34.850 INFO DAGScheduler - ResultStage 178 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
18:46:34.850 INFO DAGScheduler - Job 131 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.850 INFO TaskSchedulerImpl - Killing all running tasks in stage 178: Stage finished
18:46:34.850 INFO DAGScheduler - Job 131 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057943 s
18:46:34.853 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:34.854 INFO DAGScheduler - Got job 132 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:34.854 INFO DAGScheduler - Final stage: ResultStage 179 (count at ReadsSparkSinkUnitTest.java:185)
18:46:34.854 INFO DAGScheduler - Parents of final stage: List()
18:46:34.854 INFO DAGScheduler - Missing parents: List()
18:46:34.854 INFO DAGScheduler - Submitting ResultStage 179 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:34.860 INFO MemoryStore - Block broadcast_350 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
18:46:34.860 INFO MemoryStore - Block broadcast_350_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.0 MiB)
18:46:34.861 INFO BlockManagerInfo - Added broadcast_350_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.4 MiB)
18:46:34.861 INFO SparkContext - Created broadcast 350 from broadcast at DAGScheduler.scala:1580
18:46:34.861 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 179 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:34.861 INFO TaskSchedulerImpl - Adding task set 179.0 with 1 tasks resource profile 0
18:46:34.861 INFO TaskSetManager - Starting task 0.0 in stage 179.0 (TID 235) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:34.861 INFO Executor - Running task 0.0 in stage 179.0 (TID 235)
18:46:34.872 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17804319436065731535.bam:0+237038
18:46:34.875 INFO Executor - Finished task 0.0 in stage 179.0 (TID 235). 989 bytes result sent to driver
18:46:34.875 INFO TaskSetManager - Finished task 0.0 in stage 179.0 (TID 235) in 14 ms on localhost (executor driver) (1/1)
18:46:34.875 INFO TaskSchedulerImpl - Removed TaskSet 179.0, whose tasks have all completed, from pool
18:46:34.876 INFO DAGScheduler - ResultStage 179 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
18:46:34.876 INFO DAGScheduler - Job 132 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:34.876 INFO TaskSchedulerImpl - Killing all running tasks in stage 179: Stage finished
18:46:34.877 INFO DAGScheduler - Job 132 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.023377 s
18:46:34.884 INFO MemoryStore - Block broadcast_351 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
18:46:34.890 INFO MemoryStore - Block broadcast_351_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
18:46:34.891 INFO BlockManagerInfo - Added broadcast_351_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:34.891 INFO SparkContext - Created broadcast 351 from newAPIHadoopFile at PathSplitSource.java:96
18:46:34.912 INFO MemoryStore - Block broadcast_352 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
18:46:34.918 INFO MemoryStore - Block broadcast_352_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:34.918 INFO BlockManagerInfo - Added broadcast_352_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:34.918 INFO SparkContext - Created broadcast 352 from newAPIHadoopFile at PathSplitSource.java:96
18:46:34.937 INFO FileInputFormat - Total input files to process : 1
18:46:34.939 INFO MemoryStore - Block broadcast_353 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
18:46:34.940 INFO MemoryStore - Block broadcast_353_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
18:46:34.940 INFO BlockManagerInfo - Added broadcast_353_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:34.940 INFO SparkContext - Created broadcast 353 from broadcast at ReadsSparkSink.java:133
18:46:34.940 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:34.941 INFO MemoryStore - Block broadcast_354 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
18:46:34.942 INFO MemoryStore - Block broadcast_354_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
18:46:34.942 INFO BlockManagerInfo - Added broadcast_354_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:34.942 INFO SparkContext - Created broadcast 354 from broadcast at BamSink.java:76
18:46:34.943 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:34.944 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:34.944 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:34.959 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:34.960 INFO DAGScheduler - Registering RDD 853 (mapToPair at SparkUtils.java:161) as input to shuffle 36
18:46:34.960 INFO DAGScheduler - Got job 133 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:34.960 INFO DAGScheduler - Final stage: ResultStage 181 (runJob at SparkHadoopWriter.scala:83)
18:46:34.960 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 180)
18:46:34.960 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 180)
18:46:34.960 INFO DAGScheduler - Submitting ShuffleMapStage 180 (MapPartitionsRDD[853] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:34.977 INFO MemoryStore - Block broadcast_355 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
18:46:34.979 INFO MemoryStore - Block broadcast_355_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.3 MiB)
18:46:34.979 INFO BlockManagerInfo - Added broadcast_355_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.1 MiB)
18:46:34.979 INFO SparkContext - Created broadcast 355 from broadcast at DAGScheduler.scala:1580
18:46:34.979 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 180 (MapPartitionsRDD[853] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:34.979 INFO TaskSchedulerImpl - Adding task set 180.0 with 1 tasks resource profile 0
18:46:34.980 INFO TaskSetManager - Starting task 0.0 in stage 180.0 (TID 236) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:34.980 INFO Executor - Running task 0.0 in stage 180.0 (TID 236)
18:46:35.009 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:35.024 INFO Executor - Finished task 0.0 in stage 180.0 (TID 236). 1148 bytes result sent to driver
18:46:35.024 INFO TaskSetManager - Finished task 0.0 in stage 180.0 (TID 236) in 45 ms on localhost (executor driver) (1/1)
18:46:35.024 INFO TaskSchedulerImpl - Removed TaskSet 180.0, whose tasks have all completed, from pool
18:46:35.024 INFO DAGScheduler - ShuffleMapStage 180 (mapToPair at SparkUtils.java:161) finished in 0.064 s
18:46:35.024 INFO DAGScheduler - looking for newly runnable stages
18:46:35.024 INFO DAGScheduler - running: HashSet()
18:46:35.024 INFO DAGScheduler - waiting: HashSet(ResultStage 181)
18:46:35.025 INFO DAGScheduler - failed: HashSet()
18:46:35.025 INFO DAGScheduler - Submitting ResultStage 181 (MapPartitionsRDD[858] at mapToPair at BamSink.java:91), which has no missing parents
18:46:35.031 INFO MemoryStore - Block broadcast_356 stored as values in memory (estimated size 241.4 KiB, free 1915.1 MiB)
18:46:35.032 INFO MemoryStore - Block broadcast_356_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.0 MiB)
18:46:35.032 INFO BlockManagerInfo - Added broadcast_356_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.0 MiB)
18:46:35.032 INFO SparkContext - Created broadcast 356 from broadcast at DAGScheduler.scala:1580
18:46:35.033 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 181 (MapPartitionsRDD[858] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:35.033 INFO TaskSchedulerImpl - Adding task set 181.0 with 1 tasks resource profile 0
18:46:35.033 INFO TaskSetManager - Starting task 0.0 in stage 181.0 (TID 237) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:35.033 INFO Executor - Running task 0.0 in stage 181.0 (TID 237)
18:46:35.037 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:35.037 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:35.049 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:35.049 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:35.049 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:35.050 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:35.050 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:35.050 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:35.069 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846347962943699682338715_0858_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace63808345915256651/_temporary/0/task_202505191846347962943699682338715_0858_r_000000
18:46:35.069 INFO SparkHadoopMapRedUtil - attempt_202505191846347962943699682338715_0858_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:35.070 INFO Executor - Finished task 0.0 in stage 181.0 (TID 237). 1858 bytes result sent to driver
18:46:35.070 INFO TaskSetManager - Finished task 0.0 in stage 181.0 (TID 237) in 37 ms on localhost (executor driver) (1/1)
18:46:35.070 INFO TaskSchedulerImpl - Removed TaskSet 181.0, whose tasks have all completed, from pool
18:46:35.070 INFO DAGScheduler - ResultStage 181 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
18:46:35.070 INFO DAGScheduler - Job 133 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.070 INFO TaskSchedulerImpl - Killing all running tasks in stage 181: Stage finished
18:46:35.070 INFO DAGScheduler - Job 133 finished: runJob at SparkHadoopWriter.scala:83, took 0.110866 s
18:46:35.071 INFO SparkHadoopWriter - Start to commit write Job job_202505191846347962943699682338715_0858.
18:46:35.075 INFO SparkHadoopWriter - Write Job job_202505191846347962943699682338715_0858 committed. Elapsed time: 4 ms.
18:46:35.086 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest111026487237990180348.bam
18:46:35.091 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest111026487237990180348.bam done
18:46:35.091 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace63808345915256651 to /tmp/ReadsSparkSinkUnitTest111026487237990180348.bam.bai
18:46:35.095 INFO IndexFileMerger - Done merging .bai files
18:46:35.098 INFO MemoryStore - Block broadcast_357 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
18:46:35.104 INFO MemoryStore - Block broadcast_357_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1914.7 MiB)
18:46:35.104 INFO BlockManagerInfo - Added broadcast_357_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.0 MiB)
18:46:35.105 INFO SparkContext - Created broadcast 357 from newAPIHadoopFile at PathSplitSource.java:96
18:46:35.124 INFO FileInputFormat - Total input files to process : 1
18:46:35.159 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:35.159 INFO DAGScheduler - Got job 134 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:35.159 INFO DAGScheduler - Final stage: ResultStage 182 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:35.159 INFO DAGScheduler - Parents of final stage: List()
18:46:35.159 INFO DAGScheduler - Missing parents: List()
18:46:35.160 INFO DAGScheduler - Submitting ResultStage 182 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:35.176 INFO MemoryStore - Block broadcast_358 stored as values in memory (estimated size 426.2 KiB, free 1914.3 MiB)
18:46:35.181 INFO BlockManagerInfo - Removed broadcast_352_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.0 MiB)
18:46:35.181 INFO BlockManagerInfo - Removed broadcast_350_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.1 MiB)
18:46:35.181 INFO MemoryStore - Block broadcast_358_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1914.7 MiB)
18:46:35.182 INFO BlockManagerInfo - Added broadcast_358_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1918.9 MiB)
18:46:35.182 INFO SparkContext - Created broadcast 358 from broadcast at DAGScheduler.scala:1580
18:46:35.182 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 182 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:35.182 INFO TaskSchedulerImpl - Adding task set 182.0 with 1 tasks resource profile 0
18:46:35.182 INFO BlockManagerInfo - Removed broadcast_342_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.0 MiB)
18:46:35.183 INFO TaskSetManager - Starting task 0.0 in stage 182.0 (TID 238) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:35.183 INFO BlockManagerInfo - Removed broadcast_346_piece0 on localhost:45727 in memory (size: 8.3 KiB, free: 1919.0 MiB)
18:46:35.183 INFO Executor - Running task 0.0 in stage 182.0 (TID 238)
18:46:35.183 INFO BlockManagerInfo - Removed broadcast_344_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.1 MiB)
18:46:35.184 INFO BlockManagerInfo - Removed broadcast_349_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:35.185 INFO BlockManagerInfo - Removed broadcast_347_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:35.186 INFO BlockManagerInfo - Removed broadcast_343_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:35.186 INFO BlockManagerInfo - Removed broadcast_354_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:35.188 INFO BlockManagerInfo - Removed broadcast_355_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.5 MiB)
18:46:35.188 INFO BlockManagerInfo - Removed broadcast_345_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.6 MiB)
18:46:35.189 INFO BlockManagerInfo - Removed broadcast_348_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.6 MiB)
18:46:35.189 INFO BlockManagerInfo - Removed broadcast_356_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.7 MiB)
18:46:35.190 INFO BlockManagerInfo - Removed broadcast_353_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.7 MiB)
18:46:35.190 INFO BlockManagerInfo - Removed broadcast_340_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:35.219 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111026487237990180348.bam:0+237038
18:46:35.231 INFO Executor - Finished task 0.0 in stage 182.0 (TID 238). 651483 bytes result sent to driver
18:46:35.232 INFO TaskSetManager - Finished task 0.0 in stage 182.0 (TID 238) in 50 ms on localhost (executor driver) (1/1)
18:46:35.232 INFO TaskSchedulerImpl - Removed TaskSet 182.0, whose tasks have all completed, from pool
18:46:35.233 INFO DAGScheduler - ResultStage 182 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.073 s
18:46:35.233 INFO DAGScheduler - Job 134 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.233 INFO TaskSchedulerImpl - Killing all running tasks in stage 182: Stage finished
18:46:35.233 INFO DAGScheduler - Job 134 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.073672 s
18:46:35.242 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:35.242 INFO DAGScheduler - Got job 135 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:35.242 INFO DAGScheduler - Final stage: ResultStage 183 (count at ReadsSparkSinkUnitTest.java:185)
18:46:35.242 INFO DAGScheduler - Parents of final stage: List()
18:46:35.242 INFO DAGScheduler - Missing parents: List()
18:46:35.243 INFO DAGScheduler - Submitting ResultStage 183 (MapPartitionsRDD[846] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:35.260 INFO MemoryStore - Block broadcast_359 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
18:46:35.261 INFO MemoryStore - Block broadcast_359_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
18:46:35.261 INFO BlockManagerInfo - Added broadcast_359_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.6 MiB)
18:46:35.261 INFO SparkContext - Created broadcast 359 from broadcast at DAGScheduler.scala:1580
18:46:35.261 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 183 (MapPartitionsRDD[846] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:35.261 INFO TaskSchedulerImpl - Adding task set 183.0 with 1 tasks resource profile 0
18:46:35.262 INFO TaskSetManager - Starting task 0.0 in stage 183.0 (TID 239) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:35.262 INFO Executor - Running task 0.0 in stage 183.0 (TID 239)
18:46:35.291 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:35.301 INFO Executor - Finished task 0.0 in stage 183.0 (TID 239). 989 bytes result sent to driver
18:46:35.301 INFO TaskSetManager - Finished task 0.0 in stage 183.0 (TID 239) in 39 ms on localhost (executor driver) (1/1)
18:46:35.301 INFO TaskSchedulerImpl - Removed TaskSet 183.0, whose tasks have all completed, from pool
18:46:35.301 INFO DAGScheduler - ResultStage 183 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
18:46:35.301 INFO DAGScheduler - Job 135 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.301 INFO TaskSchedulerImpl - Killing all running tasks in stage 183: Stage finished
18:46:35.301 INFO DAGScheduler - Job 135 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059232 s
18:46:35.305 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:35.305 INFO DAGScheduler - Got job 136 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:35.305 INFO DAGScheduler - Final stage: ResultStage 184 (count at ReadsSparkSinkUnitTest.java:185)
18:46:35.305 INFO DAGScheduler - Parents of final stage: List()
18:46:35.305 INFO DAGScheduler - Missing parents: List()
18:46:35.305 INFO DAGScheduler - Submitting ResultStage 184 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:35.322 INFO MemoryStore - Block broadcast_360 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
18:46:35.323 INFO MemoryStore - Block broadcast_360_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
18:46:35.323 INFO BlockManagerInfo - Added broadcast_360_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:35.323 INFO SparkContext - Created broadcast 360 from broadcast at DAGScheduler.scala:1580
18:46:35.323 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 184 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:35.323 INFO TaskSchedulerImpl - Adding task set 184.0 with 1 tasks resource profile 0
18:46:35.324 INFO TaskSetManager - Starting task 0.0 in stage 184.0 (TID 240) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:35.324 INFO Executor - Running task 0.0 in stage 184.0 (TID 240)
18:46:35.352 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111026487237990180348.bam:0+237038
18:46:35.364 INFO Executor - Finished task 0.0 in stage 184.0 (TID 240). 989 bytes result sent to driver
18:46:35.364 INFO TaskSetManager - Finished task 0.0 in stage 184.0 (TID 240) in 40 ms on localhost (executor driver) (1/1)
18:46:35.364 INFO TaskSchedulerImpl - Removed TaskSet 184.0, whose tasks have all completed, from pool
18:46:35.364 INFO DAGScheduler - ResultStage 184 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.059 s
18:46:35.364 INFO DAGScheduler - Job 136 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.364 INFO TaskSchedulerImpl - Killing all running tasks in stage 184: Stage finished
18:46:35.364 INFO DAGScheduler - Job 136 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059701 s
18:46:35.373 INFO MemoryStore - Block broadcast_361 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
18:46:35.383 INFO MemoryStore - Block broadcast_361_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
18:46:35.383 INFO BlockManagerInfo - Added broadcast_361_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:35.383 INFO SparkContext - Created broadcast 361 from newAPIHadoopFile at PathSplitSource.java:96
18:46:35.405 INFO MemoryStore - Block broadcast_362 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
18:46:35.412 INFO MemoryStore - Block broadcast_362_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
18:46:35.412 INFO BlockManagerInfo - Added broadcast_362_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:35.412 INFO SparkContext - Created broadcast 362 from newAPIHadoopFile at PathSplitSource.java:96
18:46:35.432 INFO FileInputFormat - Total input files to process : 1
18:46:35.433 INFO MemoryStore - Block broadcast_363 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
18:46:35.434 INFO MemoryStore - Block broadcast_363_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
18:46:35.434 INFO BlockManagerInfo - Added broadcast_363_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:35.434 INFO SparkContext - Created broadcast 363 from broadcast at ReadsSparkSink.java:133
18:46:35.435 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:35.436 INFO MemoryStore - Block broadcast_364 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
18:46:35.437 INFO MemoryStore - Block broadcast_364_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
18:46:35.437 INFO BlockManagerInfo - Added broadcast_364_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:35.437 INFO SparkContext - Created broadcast 364 from broadcast at BamSink.java:76
18:46:35.439 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:35.439 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:35.439 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:35.460 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:35.461 INFO DAGScheduler - Registering RDD 879 (mapToPair at SparkUtils.java:161) as input to shuffle 37
18:46:35.461 INFO DAGScheduler - Got job 137 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:35.461 INFO DAGScheduler - Final stage: ResultStage 186 (runJob at SparkHadoopWriter.scala:83)
18:46:35.461 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 185)
18:46:35.461 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 185)
18:46:35.461 INFO DAGScheduler - Submitting ShuffleMapStage 185 (MapPartitionsRDD[879] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:35.478 INFO MemoryStore - Block broadcast_365 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
18:46:35.480 INFO MemoryStore - Block broadcast_365_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
18:46:35.480 INFO BlockManagerInfo - Added broadcast_365_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.2 MiB)
18:46:35.480 INFO SparkContext - Created broadcast 365 from broadcast at DAGScheduler.scala:1580
18:46:35.480 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 185 (MapPartitionsRDD[879] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:35.480 INFO TaskSchedulerImpl - Adding task set 185.0 with 1 tasks resource profile 0
18:46:35.481 INFO TaskSetManager - Starting task 0.0 in stage 185.0 (TID 241) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:35.481 INFO Executor - Running task 0.0 in stage 185.0 (TID 241)
18:46:35.510 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:35.526 INFO Executor - Finished task 0.0 in stage 185.0 (TID 241). 1148 bytes result sent to driver
18:46:35.526 INFO TaskSetManager - Finished task 0.0 in stage 185.0 (TID 241) in 46 ms on localhost (executor driver) (1/1)
18:46:35.526 INFO TaskSchedulerImpl - Removed TaskSet 185.0, whose tasks have all completed, from pool
18:46:35.526 INFO DAGScheduler - ShuffleMapStage 185 (mapToPair at SparkUtils.java:161) finished in 0.065 s
18:46:35.526 INFO DAGScheduler - looking for newly runnable stages
18:46:35.526 INFO DAGScheduler - running: HashSet()
18:46:35.526 INFO DAGScheduler - waiting: HashSet(ResultStage 186)
18:46:35.526 INFO DAGScheduler - failed: HashSet()
18:46:35.526 INFO DAGScheduler - Submitting ResultStage 186 (MapPartitionsRDD[884] at mapToPair at BamSink.java:91), which has no missing parents
18:46:35.533 INFO MemoryStore - Block broadcast_366 stored as values in memory (estimated size 241.4 KiB, free 1915.7 MiB)
18:46:35.534 INFO MemoryStore - Block broadcast_366_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.6 MiB)
18:46:35.534 INFO BlockManagerInfo - Added broadcast_366_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.1 MiB)
18:46:35.534 INFO SparkContext - Created broadcast 366 from broadcast at DAGScheduler.scala:1580
18:46:35.534 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 186 (MapPartitionsRDD[884] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:35.535 INFO TaskSchedulerImpl - Adding task set 186.0 with 1 tasks resource profile 0
18:46:35.535 INFO TaskSetManager - Starting task 0.0 in stage 186.0 (TID 242) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:35.535 INFO Executor - Running task 0.0 in stage 186.0 (TID 242)
18:46:35.542 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:35.542 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:35.555 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:35.555 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:35.555 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:35.555 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:35.555 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:35.555 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:35.573 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846355395230591324669483_0884_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace541698172963714276/_temporary/0/task_202505191846355395230591324669483_0884_r_000000
18:46:35.573 INFO SparkHadoopMapRedUtil - attempt_202505191846355395230591324669483_0884_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:35.574 INFO Executor - Finished task 0.0 in stage 186.0 (TID 242). 1858 bytes result sent to driver
18:46:35.574 INFO TaskSetManager - Finished task 0.0 in stage 186.0 (TID 242) in 39 ms on localhost (executor driver) (1/1)
18:46:35.574 INFO TaskSchedulerImpl - Removed TaskSet 186.0, whose tasks have all completed, from pool
18:46:35.574 INFO DAGScheduler - ResultStage 186 (runJob at SparkHadoopWriter.scala:83) finished in 0.047 s
18:46:35.574 INFO DAGScheduler - Job 137 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.574 INFO TaskSchedulerImpl - Killing all running tasks in stage 186: Stage finished
18:46:35.575 INFO DAGScheduler - Job 137 finished: runJob at SparkHadoopWriter.scala:83, took 0.114244 s
18:46:35.575 INFO SparkHadoopWriter - Start to commit write Job job_202505191846355395230591324669483_0884.
18:46:35.579 INFO SparkHadoopWriter - Write Job job_202505191846355395230591324669483_0884 committed. Elapsed time: 4 ms.
18:46:35.590 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest112449949559382467309.bam
18:46:35.594 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest112449949559382467309.bam done
18:46:35.594 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace541698172963714276 to /tmp/ReadsSparkSinkUnitTest112449949559382467309.bam.sbi
18:46:35.598 INFO IndexFileMerger - Done merging .sbi files
18:46:35.600 INFO MemoryStore - Block broadcast_367 stored as values in memory (estimated size 320.0 B, free 1915.6 MiB)
18:46:35.600 INFO MemoryStore - Block broadcast_367_piece0 stored as bytes in memory (estimated size 233.0 B, free 1915.6 MiB)
18:46:35.600 INFO BlockManagerInfo - Added broadcast_367_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.1 MiB)
18:46:35.601 INFO SparkContext - Created broadcast 367 from broadcast at BamSource.java:104
18:46:35.602 INFO MemoryStore - Block broadcast_368 stored as values in memory (estimated size 297.9 KiB, free 1915.3 MiB)
18:46:35.608 INFO MemoryStore - Block broadcast_368_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
18:46:35.608 INFO BlockManagerInfo - Added broadcast_368_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.1 MiB)
18:46:35.608 INFO SparkContext - Created broadcast 368 from newAPIHadoopFile at PathSplitSource.java:96
18:46:35.617 INFO FileInputFormat - Total input files to process : 1
18:46:35.631 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:35.631 INFO DAGScheduler - Got job 138 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:35.631 INFO DAGScheduler - Final stage: ResultStage 187 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:35.631 INFO DAGScheduler - Parents of final stage: List()
18:46:35.631 INFO DAGScheduler - Missing parents: List()
18:46:35.631 INFO DAGScheduler - Submitting ResultStage 187 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:35.637 INFO MemoryStore - Block broadcast_369 stored as values in memory (estimated size 148.2 KiB, free 1915.1 MiB)
18:46:35.638 INFO MemoryStore - Block broadcast_369_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.1 MiB)
18:46:35.638 INFO BlockManagerInfo - Added broadcast_369_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.0 MiB)
18:46:35.638 INFO SparkContext - Created broadcast 369 from broadcast at DAGScheduler.scala:1580
18:46:35.638 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 187 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:35.638 INFO TaskSchedulerImpl - Adding task set 187.0 with 1 tasks resource profile 0
18:46:35.639 INFO TaskSetManager - Starting task 0.0 in stage 187.0 (TID 243) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:35.639 INFO Executor - Running task 0.0 in stage 187.0 (TID 243)
18:46:35.650 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest112449949559382467309.bam:0+237038
18:46:35.654 INFO Executor - Finished task 0.0 in stage 187.0 (TID 243). 651483 bytes result sent to driver
18:46:35.657 INFO TaskSetManager - Finished task 0.0 in stage 187.0 (TID 243) in 18 ms on localhost (executor driver) (1/1)
18:46:35.657 INFO TaskSchedulerImpl - Removed TaskSet 187.0, whose tasks have all completed, from pool
18:46:35.657 INFO DAGScheduler - ResultStage 187 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
18:46:35.657 INFO DAGScheduler - Job 138 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.657 INFO TaskSchedulerImpl - Killing all running tasks in stage 187: Stage finished
18:46:35.657 INFO DAGScheduler - Job 138 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026023 s
18:46:35.666 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:35.666 INFO DAGScheduler - Got job 139 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:35.666 INFO DAGScheduler - Final stage: ResultStage 188 (count at ReadsSparkSinkUnitTest.java:185)
18:46:35.666 INFO DAGScheduler - Parents of final stage: List()
18:46:35.666 INFO DAGScheduler - Missing parents: List()
18:46:35.666 INFO DAGScheduler - Submitting ResultStage 188 (MapPartitionsRDD[872] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:35.683 INFO MemoryStore - Block broadcast_370 stored as values in memory (estimated size 426.1 KiB, free 1914.7 MiB)
18:46:35.684 INFO MemoryStore - Block broadcast_370_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1914.5 MiB)
18:46:35.684 INFO BlockManagerInfo - Added broadcast_370_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1918.9 MiB)
18:46:35.685 INFO SparkContext - Created broadcast 370 from broadcast at DAGScheduler.scala:1580
18:46:35.685 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 188 (MapPartitionsRDD[872] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:35.685 INFO TaskSchedulerImpl - Adding task set 188.0 with 1 tasks resource profile 0
18:46:35.685 INFO TaskSetManager - Starting task 0.0 in stage 188.0 (TID 244) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:35.685 INFO Executor - Running task 0.0 in stage 188.0 (TID 244)
18:46:35.714 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:35.723 INFO Executor - Finished task 0.0 in stage 188.0 (TID 244). 989 bytes result sent to driver
18:46:35.724 INFO TaskSetManager - Finished task 0.0 in stage 188.0 (TID 244) in 38 ms on localhost (executor driver) (1/1)
18:46:35.724 INFO TaskSchedulerImpl - Removed TaskSet 188.0, whose tasks have all completed, from pool
18:46:35.724 INFO DAGScheduler - ResultStage 188 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
18:46:35.724 INFO DAGScheduler - Job 139 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.724 INFO TaskSchedulerImpl - Killing all running tasks in stage 188: Stage finished
18:46:35.724 INFO DAGScheduler - Job 139 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057848 s
18:46:35.727 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:35.727 INFO DAGScheduler - Got job 140 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:35.727 INFO DAGScheduler - Final stage: ResultStage 189 (count at ReadsSparkSinkUnitTest.java:185)
18:46:35.727 INFO DAGScheduler - Parents of final stage: List()
18:46:35.727 INFO DAGScheduler - Missing parents: List()
18:46:35.728 INFO DAGScheduler - Submitting ResultStage 189 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:35.739 INFO BlockManagerInfo - Removed broadcast_357_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1918.9 MiB)
18:46:35.740 INFO BlockManagerInfo - Removed broadcast_362_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.0 MiB)
18:46:35.740 INFO BlockManagerInfo - Removed broadcast_360_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.1 MiB)
18:46:35.741 INFO BlockManagerInfo - Removed broadcast_366_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.2 MiB)
18:46:35.741 INFO BlockManagerInfo - Removed broadcast_351_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:35.742 INFO BlockManagerInfo - Removed broadcast_358_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.4 MiB)
18:46:35.742 INFO MemoryStore - Block broadcast_371 stored as values in memory (estimated size 148.1 KiB, free 1916.8 MiB)
18:46:35.742 INFO BlockManagerInfo - Removed broadcast_370_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.5 MiB)
18:46:35.743 INFO BlockManagerInfo - Removed broadcast_364_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:35.743 INFO MemoryStore - Block broadcast_371_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.5 MiB)
18:46:35.743 INFO BlockManagerInfo - Added broadcast_371_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.5 MiB)
18:46:35.743 INFO SparkContext - Created broadcast 371 from broadcast at DAGScheduler.scala:1580
18:46:35.743 INFO BlockManagerInfo - Removed broadcast_365_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.6 MiB)
18:46:35.743 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 189 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:35.743 INFO TaskSchedulerImpl - Adding task set 189.0 with 1 tasks resource profile 0
18:46:35.744 INFO BlockManagerInfo - Removed broadcast_363_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:35.744 INFO BlockManagerInfo - Removed broadcast_359_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.8 MiB)
18:46:35.744 INFO TaskSetManager - Starting task 0.0 in stage 189.0 (TID 245) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:35.744 INFO Executor - Running task 0.0 in stage 189.0 (TID 245)
18:46:35.745 INFO BlockManagerInfo - Removed broadcast_369_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.8 MiB)
18:46:35.760 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest112449949559382467309.bam:0+237038
18:46:35.764 INFO Executor - Finished task 0.0 in stage 189.0 (TID 245). 989 bytes result sent to driver
18:46:35.764 INFO TaskSetManager - Finished task 0.0 in stage 189.0 (TID 245) in 20 ms on localhost (executor driver) (1/1)
18:46:35.764 INFO TaskSchedulerImpl - Removed TaskSet 189.0, whose tasks have all completed, from pool
18:46:35.764 INFO DAGScheduler - ResultStage 189 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.036 s
18:46:35.764 INFO DAGScheduler - Job 140 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.764 INFO TaskSchedulerImpl - Killing all running tasks in stage 189: Stage finished
18:46:35.764 INFO DAGScheduler - Job 140 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.037026 s
18:46:35.772 INFO MemoryStore - Block broadcast_372 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
18:46:35.779 INFO MemoryStore - Block broadcast_372_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.8 MiB)
18:46:35.779 INFO BlockManagerInfo - Added broadcast_372_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.8 MiB)
18:46:35.779 INFO SparkContext - Created broadcast 372 from newAPIHadoopFile at PathSplitSource.java:96
18:46:35.801 INFO MemoryStore - Block broadcast_373 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
18:46:35.807 INFO MemoryStore - Block broadcast_373_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
18:46:35.807 INFO BlockManagerInfo - Added broadcast_373_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.8 MiB)
18:46:35.807 INFO SparkContext - Created broadcast 373 from newAPIHadoopFile at PathSplitSource.java:96
18:46:35.827 INFO FileInputFormat - Total input files to process : 1
18:46:35.829 INFO MemoryStore - Block broadcast_374 stored as values in memory (estimated size 160.7 KiB, free 1918.3 MiB)
18:46:35.829 INFO MemoryStore - Block broadcast_374_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.3 MiB)
18:46:35.829 INFO BlockManagerInfo - Added broadcast_374_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.7 MiB)
18:46:35.830 INFO SparkContext - Created broadcast 374 from broadcast at ReadsSparkSink.java:133
18:46:35.830 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:35.830 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:35.831 INFO MemoryStore - Block broadcast_375 stored as values in memory (estimated size 163.2 KiB, free 1918.1 MiB)
18:46:35.832 INFO MemoryStore - Block broadcast_375_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.1 MiB)
18:46:35.832 INFO BlockManagerInfo - Added broadcast_375_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.7 MiB)
18:46:35.832 INFO SparkContext - Created broadcast 375 from broadcast at BamSink.java:76
18:46:35.833 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:35.833 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:35.833 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:35.851 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:35.851 INFO DAGScheduler - Registering RDD 904 (mapToPair at SparkUtils.java:161) as input to shuffle 38
18:46:35.851 INFO DAGScheduler - Got job 141 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:35.851 INFO DAGScheduler - Final stage: ResultStage 191 (runJob at SparkHadoopWriter.scala:83)
18:46:35.851 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 190)
18:46:35.851 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 190)
18:46:35.851 INFO DAGScheduler - Submitting ShuffleMapStage 190 (MapPartitionsRDD[904] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:35.869 INFO MemoryStore - Block broadcast_376 stored as values in memory (estimated size 520.4 KiB, free 1917.6 MiB)
18:46:35.870 INFO MemoryStore - Block broadcast_376_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.4 MiB)
18:46:35.870 INFO BlockManagerInfo - Added broadcast_376_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.6 MiB)
18:46:35.870 INFO SparkContext - Created broadcast 376 from broadcast at DAGScheduler.scala:1580
18:46:35.870 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 190 (MapPartitionsRDD[904] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:35.870 INFO TaskSchedulerImpl - Adding task set 190.0 with 1 tasks resource profile 0
18:46:35.871 INFO TaskSetManager - Starting task 0.0 in stage 190.0 (TID 246) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:35.871 INFO Executor - Running task 0.0 in stage 190.0 (TID 246)
18:46:35.901 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:35.917 INFO Executor - Finished task 0.0 in stage 190.0 (TID 246). 1148 bytes result sent to driver
18:46:35.917 INFO TaskSetManager - Finished task 0.0 in stage 190.0 (TID 246) in 46 ms on localhost (executor driver) (1/1)
18:46:35.917 INFO TaskSchedulerImpl - Removed TaskSet 190.0, whose tasks have all completed, from pool
18:46:35.917 INFO DAGScheduler - ShuffleMapStage 190 (mapToPair at SparkUtils.java:161) finished in 0.065 s
18:46:35.917 INFO DAGScheduler - looking for newly runnable stages
18:46:35.917 INFO DAGScheduler - running: HashSet()
18:46:35.917 INFO DAGScheduler - waiting: HashSet(ResultStage 191)
18:46:35.917 INFO DAGScheduler - failed: HashSet()
18:46:35.917 INFO DAGScheduler - Submitting ResultStage 191 (MapPartitionsRDD[909] at mapToPair at BamSink.java:91), which has no missing parents
18:46:35.924 INFO MemoryStore - Block broadcast_377 stored as values in memory (estimated size 241.4 KiB, free 1917.2 MiB)
18:46:35.925 INFO MemoryStore - Block broadcast_377_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1917.1 MiB)
18:46:35.925 INFO BlockManagerInfo - Added broadcast_377_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.5 MiB)
18:46:35.925 INFO SparkContext - Created broadcast 377 from broadcast at DAGScheduler.scala:1580
18:46:35.925 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 191 (MapPartitionsRDD[909] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:35.925 INFO TaskSchedulerImpl - Adding task set 191.0 with 1 tasks resource profile 0
18:46:35.926 INFO TaskSetManager - Starting task 0.0 in stage 191.0 (TID 247) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:35.926 INFO Executor - Running task 0.0 in stage 191.0 (TID 247)
18:46:35.930 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:35.930 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:35.941 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:35.941 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:35.941 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:35.941 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:35.941 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:35.941 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:35.955 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846351530344374264266799_0909_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace13888288284928534505/_temporary/0/task_202505191846351530344374264266799_0909_r_000000
18:46:35.955 INFO SparkHadoopMapRedUtil - attempt_202505191846351530344374264266799_0909_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:35.956 INFO Executor - Finished task 0.0 in stage 191.0 (TID 247). 1858 bytes result sent to driver
18:46:35.956 INFO TaskSetManager - Finished task 0.0 in stage 191.0 (TID 247) in 30 ms on localhost (executor driver) (1/1)
18:46:35.956 INFO TaskSchedulerImpl - Removed TaskSet 191.0, whose tasks have all completed, from pool
18:46:35.956 INFO DAGScheduler - ResultStage 191 (runJob at SparkHadoopWriter.scala:83) finished in 0.038 s
18:46:35.956 INFO DAGScheduler - Job 141 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:35.956 INFO TaskSchedulerImpl - Killing all running tasks in stage 191: Stage finished
18:46:35.956 INFO DAGScheduler - Job 141 finished: runJob at SparkHadoopWriter.scala:83, took 0.105655 s
18:46:35.956 INFO SparkHadoopWriter - Start to commit write Job job_202505191846351530344374264266799_0909.
18:46:35.961 INFO SparkHadoopWriter - Write Job job_202505191846351530344374264266799_0909 committed. Elapsed time: 4 ms.
18:46:35.972 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest110544081112508927603.bam
18:46:35.976 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest110544081112508927603.bam done
18:46:35.978 INFO MemoryStore - Block broadcast_378 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
18:46:35.987 INFO MemoryStore - Block broadcast_378_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
18:46:35.987 INFO BlockManagerInfo - Added broadcast_378_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:35.988 INFO SparkContext - Created broadcast 378 from newAPIHadoopFile at PathSplitSource.java:96
18:46:36.007 INFO FileInputFormat - Total input files to process : 1
18:46:36.042 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:36.042 INFO DAGScheduler - Got job 142 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:36.043 INFO DAGScheduler - Final stage: ResultStage 192 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:36.043 INFO DAGScheduler - Parents of final stage: List()
18:46:36.043 INFO DAGScheduler - Missing parents: List()
18:46:36.043 INFO DAGScheduler - Submitting ResultStage 192 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:36.061 INFO MemoryStore - Block broadcast_379 stored as values in memory (estimated size 426.2 KiB, free 1916.4 MiB)
18:46:36.062 INFO MemoryStore - Block broadcast_379_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1916.2 MiB)
18:46:36.062 INFO BlockManagerInfo - Added broadcast_379_piece0 in memory on localhost:45727 (size: 153.7 KiB, free: 1919.3 MiB)
18:46:36.062 INFO SparkContext - Created broadcast 379 from broadcast at DAGScheduler.scala:1580
18:46:36.063 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 192 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:36.063 INFO TaskSchedulerImpl - Adding task set 192.0 with 1 tasks resource profile 0
18:46:36.063 INFO TaskSetManager - Starting task 0.0 in stage 192.0 (TID 248) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:36.063 INFO Executor - Running task 0.0 in stage 192.0 (TID 248)
18:46:36.097 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest110544081112508927603.bam:0+237038
18:46:36.109 INFO Executor - Finished task 0.0 in stage 192.0 (TID 248). 651483 bytes result sent to driver
18:46:36.112 INFO TaskSetManager - Finished task 0.0 in stage 192.0 (TID 248) in 48 ms on localhost (executor driver) (1/1)
18:46:36.112 INFO TaskSchedulerImpl - Removed TaskSet 192.0, whose tasks have all completed, from pool
18:46:36.112 INFO DAGScheduler - ResultStage 192 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.069 s
18:46:36.112 INFO DAGScheduler - Job 142 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.112 INFO TaskSchedulerImpl - Killing all running tasks in stage 192: Stage finished
18:46:36.112 INFO DAGScheduler - Job 142 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.069671 s
18:46:36.122 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:36.122 INFO DAGScheduler - Got job 143 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:36.122 INFO DAGScheduler - Final stage: ResultStage 193 (count at ReadsSparkSinkUnitTest.java:185)
18:46:36.122 INFO DAGScheduler - Parents of final stage: List()
18:46:36.122 INFO DAGScheduler - Missing parents: List()
18:46:36.122 INFO DAGScheduler - Submitting ResultStage 193 (MapPartitionsRDD[897] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:36.139 INFO MemoryStore - Block broadcast_380 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
18:46:36.140 INFO MemoryStore - Block broadcast_380_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
18:46:36.141 INFO BlockManagerInfo - Added broadcast_380_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.2 MiB)
18:46:36.141 INFO SparkContext - Created broadcast 380 from broadcast at DAGScheduler.scala:1580
18:46:36.141 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 193 (MapPartitionsRDD[897] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:36.141 INFO TaskSchedulerImpl - Adding task set 193.0 with 1 tasks resource profile 0
18:46:36.141 INFO TaskSetManager - Starting task 0.0 in stage 193.0 (TID 249) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:36.142 INFO Executor - Running task 0.0 in stage 193.0 (TID 249)
18:46:36.170 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:36.179 INFO Executor - Finished task 0.0 in stage 193.0 (TID 249). 989 bytes result sent to driver
18:46:36.179 INFO TaskSetManager - Finished task 0.0 in stage 193.0 (TID 249) in 38 ms on localhost (executor driver) (1/1)
18:46:36.180 INFO TaskSchedulerImpl - Removed TaskSet 193.0, whose tasks have all completed, from pool
18:46:36.180 INFO DAGScheduler - ResultStage 193 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
18:46:36.180 INFO DAGScheduler - Job 143 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.180 INFO TaskSchedulerImpl - Killing all running tasks in stage 193: Stage finished
18:46:36.180 INFO DAGScheduler - Job 143 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058207 s
18:46:36.184 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:36.185 INFO DAGScheduler - Got job 144 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:36.185 INFO DAGScheduler - Final stage: ResultStage 194 (count at ReadsSparkSinkUnitTest.java:185)
18:46:36.185 INFO DAGScheduler - Parents of final stage: List()
18:46:36.185 INFO DAGScheduler - Missing parents: List()
18:46:36.185 INFO DAGScheduler - Submitting ResultStage 194 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:36.201 INFO MemoryStore - Block broadcast_381 stored as values in memory (estimated size 426.1 KiB, free 1915.2 MiB)
18:46:36.203 INFO MemoryStore - Block broadcast_381_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.1 MiB)
18:46:36.203 INFO BlockManagerInfo - Added broadcast_381_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.0 MiB)
18:46:36.203 INFO SparkContext - Created broadcast 381 from broadcast at DAGScheduler.scala:1580
18:46:36.203 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 194 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:36.203 INFO TaskSchedulerImpl - Adding task set 194.0 with 1 tasks resource profile 0
18:46:36.204 INFO TaskSetManager - Starting task 0.0 in stage 194.0 (TID 250) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:36.204 INFO Executor - Running task 0.0 in stage 194.0 (TID 250)
18:46:36.233 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest110544081112508927603.bam:0+237038
18:46:36.244 INFO Executor - Finished task 0.0 in stage 194.0 (TID 250). 989 bytes result sent to driver
18:46:36.245 INFO TaskSetManager - Finished task 0.0 in stage 194.0 (TID 250) in 41 ms on localhost (executor driver) (1/1)
18:46:36.245 INFO TaskSchedulerImpl - Removed TaskSet 194.0, whose tasks have all completed, from pool
18:46:36.245 INFO DAGScheduler - ResultStage 194 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
18:46:36.245 INFO DAGScheduler - Job 144 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.245 INFO TaskSchedulerImpl - Killing all running tasks in stage 194: Stage finished
18:46:36.245 INFO DAGScheduler - Job 144 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060678 s
18:46:36.254 INFO MemoryStore - Block broadcast_382 stored as values in memory (estimated size 298.0 KiB, free 1914.8 MiB)
18:46:36.260 INFO MemoryStore - Block broadcast_382_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1914.8 MiB)
18:46:36.260 INFO BlockManagerInfo - Added broadcast_382_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.0 MiB)
18:46:36.260 INFO SparkContext - Created broadcast 382 from newAPIHadoopFile at PathSplitSource.java:96
18:46:36.281 INFO MemoryStore - Block broadcast_383 stored as values in memory (estimated size 298.0 KiB, free 1914.5 MiB)
18:46:36.288 INFO MemoryStore - Block broadcast_383_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1914.4 MiB)
18:46:36.288 INFO BlockManagerInfo - Added broadcast_383_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1918.9 MiB)
18:46:36.288 INFO SparkContext - Created broadcast 383 from newAPIHadoopFile at PathSplitSource.java:96
18:46:36.308 INFO FileInputFormat - Total input files to process : 1
18:46:36.309 INFO MemoryStore - Block broadcast_384 stored as values in memory (estimated size 160.7 KiB, free 1914.3 MiB)
18:46:36.315 INFO MemoryStore - Block broadcast_384_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1914.3 MiB)
18:46:36.315 INFO BlockManagerInfo - Added broadcast_384_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1918.9 MiB)
18:46:36.315 INFO BlockManagerInfo - Removed broadcast_379_piece0 on localhost:45727 in memory (size: 153.7 KiB, free: 1919.0 MiB)
18:46:36.316 INFO SparkContext - Created broadcast 384 from broadcast at ReadsSparkSink.java:133
18:46:36.317 INFO BlockManagerInfo - Removed broadcast_377_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.1 MiB)
18:46:36.317 INFO BlockManagerInfo - Removed broadcast_367_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.1 MiB)
18:46:36.317 INFO BlockManagerInfo - Removed broadcast_368_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:36.317 INFO MemoryStore - Block broadcast_385 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
18:46:36.318 INFO BlockManagerInfo - Removed broadcast_371_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.2 MiB)
18:46:36.318 INFO MemoryStore - Block broadcast_385_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
18:46:36.318 INFO BlockManagerInfo - Removed broadcast_383_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.3 MiB)
18:46:36.318 INFO BlockManagerInfo - Added broadcast_385_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:36.318 INFO SparkContext - Created broadcast 385 from broadcast at BamSink.java:76
18:46:36.319 INFO BlockManagerInfo - Removed broadcast_373_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:36.319 INFO BlockManagerInfo - Removed broadcast_378_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:36.319 INFO BlockManagerInfo - Removed broadcast_380_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.5 MiB)
18:46:36.320 INFO BlockManagerInfo - Removed broadcast_375_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:36.321 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:36.321 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:36.321 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:36.321 INFO BlockManagerInfo - Removed broadcast_361_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:36.321 INFO BlockManagerInfo - Removed broadcast_376_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.7 MiB)
18:46:36.322 INFO BlockManagerInfo - Removed broadcast_372_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:36.323 INFO BlockManagerInfo - Removed broadcast_374_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:36.323 INFO BlockManagerInfo - Removed broadcast_381_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.9 MiB)
18:46:36.338 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:36.339 INFO DAGScheduler - Registering RDD 930 (mapToPair at SparkUtils.java:161) as input to shuffle 39
18:46:36.339 INFO DAGScheduler - Got job 145 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:36.339 INFO DAGScheduler - Final stage: ResultStage 196 (runJob at SparkHadoopWriter.scala:83)
18:46:36.339 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 195)
18:46:36.339 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 195)
18:46:36.339 INFO DAGScheduler - Submitting ShuffleMapStage 195 (MapPartitionsRDD[930] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:36.356 INFO MemoryStore - Block broadcast_386 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
18:46:36.358 INFO MemoryStore - Block broadcast_386_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
18:46:36.358 INFO BlockManagerInfo - Added broadcast_386_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.8 MiB)
18:46:36.358 INFO SparkContext - Created broadcast 386 from broadcast at DAGScheduler.scala:1580
18:46:36.358 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 195 (MapPartitionsRDD[930] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:36.358 INFO TaskSchedulerImpl - Adding task set 195.0 with 1 tasks resource profile 0
18:46:36.359 INFO TaskSetManager - Starting task 0.0 in stage 195.0 (TID 251) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
18:46:36.359 INFO Executor - Running task 0.0 in stage 195.0 (TID 251)
18:46:36.392 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:36.412 INFO Executor - Finished task 0.0 in stage 195.0 (TID 251). 1148 bytes result sent to driver
18:46:36.413 INFO TaskSetManager - Finished task 0.0 in stage 195.0 (TID 251) in 54 ms on localhost (executor driver) (1/1)
18:46:36.413 INFO TaskSchedulerImpl - Removed TaskSet 195.0, whose tasks have all completed, from pool
18:46:36.413 INFO DAGScheduler - ShuffleMapStage 195 (mapToPair at SparkUtils.java:161) finished in 0.073 s
18:46:36.413 INFO DAGScheduler - looking for newly runnable stages
18:46:36.413 INFO DAGScheduler - running: HashSet()
18:46:36.413 INFO DAGScheduler - waiting: HashSet(ResultStage 196)
18:46:36.413 INFO DAGScheduler - failed: HashSet()
18:46:36.413 INFO DAGScheduler - Submitting ResultStage 196 (MapPartitionsRDD[935] at mapToPair at BamSink.java:91), which has no missing parents
18:46:36.420 INFO MemoryStore - Block broadcast_387 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
18:46:36.421 INFO MemoryStore - Block broadcast_387_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
18:46:36.421 INFO BlockManagerInfo - Added broadcast_387_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.7 MiB)
18:46:36.421 INFO SparkContext - Created broadcast 387 from broadcast at DAGScheduler.scala:1580
18:46:36.421 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 196 (MapPartitionsRDD[935] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:36.421 INFO TaskSchedulerImpl - Adding task set 196.0 with 1 tasks resource profile 0
18:46:36.422 INFO TaskSetManager - Starting task 0.0 in stage 196.0 (TID 252) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:36.422 INFO Executor - Running task 0.0 in stage 196.0 (TID 252)
18:46:36.426 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:36.426 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:36.437 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:36.437 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:36.437 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:36.437 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:36.437 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:36.437 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:36.462 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846363451865846865224052_0935_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest2.someOtherPlace1075944347081950561/_temporary/0/task_202505191846363451865846865224052_0935_r_000000
18:46:36.462 INFO SparkHadoopMapRedUtil - attempt_202505191846363451865846865224052_0935_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:36.463 INFO Executor - Finished task 0.0 in stage 196.0 (TID 252). 1858 bytes result sent to driver
18:46:36.463 INFO TaskSetManager - Finished task 0.0 in stage 196.0 (TID 252) in 41 ms on localhost (executor driver) (1/1)
18:46:36.463 INFO TaskSchedulerImpl - Removed TaskSet 196.0, whose tasks have all completed, from pool
18:46:36.463 INFO DAGScheduler - ResultStage 196 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
18:46:36.463 INFO DAGScheduler - Job 145 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.463 INFO TaskSchedulerImpl - Killing all running tasks in stage 196: Stage finished
18:46:36.464 INFO DAGScheduler - Job 145 finished: runJob at SparkHadoopWriter.scala:83, took 0.125334 s
18:46:36.464 INFO SparkHadoopWriter - Start to commit write Job job_202505191846363451865846865224052_0935.
18:46:36.469 INFO SparkHadoopWriter - Write Job job_202505191846363451865846865224052_0935 committed. Elapsed time: 4 ms.
18:46:36.481 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest24086021107251301756.bam
18:46:36.485 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest24086021107251301756.bam done
18:46:36.485 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest2.someOtherPlace1075944347081950561 to /tmp/ReadsSparkSinkUnitTest24086021107251301756.bam.sbi
18:46:36.490 INFO IndexFileMerger - Done merging .sbi files
18:46:36.490 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest2.someOtherPlace1075944347081950561 to /tmp/ReadsSparkSinkUnitTest24086021107251301756.bam.bai
18:46:36.495 INFO IndexFileMerger - Done merging .bai files
18:46:36.497 INFO MemoryStore - Block broadcast_388 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
18:46:36.497 INFO MemoryStore - Block broadcast_388_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
18:46:36.497 INFO BlockManagerInfo - Added broadcast_388_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.7 MiB)
18:46:36.498 INFO SparkContext - Created broadcast 388 from broadcast at BamSource.java:104
18:46:36.499 INFO MemoryStore - Block broadcast_389 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:36.509 INFO MemoryStore - Block broadcast_389_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:36.509 INFO BlockManagerInfo - Added broadcast_389_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:36.509 INFO SparkContext - Created broadcast 389 from newAPIHadoopFile at PathSplitSource.java:96
18:46:36.523 INFO FileInputFormat - Total input files to process : 1
18:46:36.537 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:36.537 INFO DAGScheduler - Got job 146 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:36.537 INFO DAGScheduler - Final stage: ResultStage 197 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:36.537 INFO DAGScheduler - Parents of final stage: List()
18:46:36.537 INFO DAGScheduler - Missing parents: List()
18:46:36.537 INFO DAGScheduler - Submitting ResultStage 197 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:36.544 INFO MemoryStore - Block broadcast_390 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
18:46:36.544 INFO MemoryStore - Block broadcast_390_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
18:46:36.544 INFO BlockManagerInfo - Added broadcast_390_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:36.545 INFO SparkContext - Created broadcast 390 from broadcast at DAGScheduler.scala:1580
18:46:36.545 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 197 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:36.545 INFO TaskSchedulerImpl - Adding task set 197.0 with 1 tasks resource profile 0
18:46:36.545 INFO TaskSetManager - Starting task 0.0 in stage 197.0 (TID 253) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:36.545 INFO Executor - Running task 0.0 in stage 197.0 (TID 253)
18:46:36.557 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest24086021107251301756.bam:0+235514
18:46:36.561 INFO Executor - Finished task 0.0 in stage 197.0 (TID 253). 650141 bytes result sent to driver
18:46:36.562 INFO TaskSetManager - Finished task 0.0 in stage 197.0 (TID 253) in 17 ms on localhost (executor driver) (1/1)
18:46:36.562 INFO TaskSchedulerImpl - Removed TaskSet 197.0, whose tasks have all completed, from pool
18:46:36.563 INFO DAGScheduler - ResultStage 197 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.024 s
18:46:36.563 INFO DAGScheduler - Job 146 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.563 INFO TaskSchedulerImpl - Killing all running tasks in stage 197: Stage finished
18:46:36.563 INFO DAGScheduler - Job 146 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025708 s
18:46:36.572 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:36.572 INFO DAGScheduler - Got job 147 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:36.572 INFO DAGScheduler - Final stage: ResultStage 198 (count at ReadsSparkSinkUnitTest.java:185)
18:46:36.572 INFO DAGScheduler - Parents of final stage: List()
18:46:36.572 INFO DAGScheduler - Missing parents: List()
18:46:36.572 INFO DAGScheduler - Submitting ResultStage 198 (MapPartitionsRDD[923] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:36.589 INFO MemoryStore - Block broadcast_391 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
18:46:36.590 INFO MemoryStore - Block broadcast_391_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
18:46:36.590 INFO BlockManagerInfo - Added broadcast_391_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:36.590 INFO SparkContext - Created broadcast 391 from broadcast at DAGScheduler.scala:1580
18:46:36.591 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 198 (MapPartitionsRDD[923] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:36.591 INFO TaskSchedulerImpl - Adding task set 198.0 with 1 tasks resource profile 0
18:46:36.591 INFO TaskSetManager - Starting task 0.0 in stage 198.0 (TID 254) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
18:46:36.591 INFO Executor - Running task 0.0 in stage 198.0 (TID 254)
18:46:36.621 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:36.632 INFO Executor - Finished task 0.0 in stage 198.0 (TID 254). 989 bytes result sent to driver
18:46:36.632 INFO TaskSetManager - Finished task 0.0 in stage 198.0 (TID 254) in 41 ms on localhost (executor driver) (1/1)
18:46:36.632 INFO TaskSchedulerImpl - Removed TaskSet 198.0, whose tasks have all completed, from pool
18:46:36.632 INFO DAGScheduler - ResultStage 198 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
18:46:36.632 INFO DAGScheduler - Job 147 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.632 INFO TaskSchedulerImpl - Killing all running tasks in stage 198: Stage finished
18:46:36.633 INFO DAGScheduler - Job 147 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060705 s
18:46:36.636 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:36.636 INFO DAGScheduler - Got job 148 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:36.636 INFO DAGScheduler - Final stage: ResultStage 199 (count at ReadsSparkSinkUnitTest.java:185)
18:46:36.636 INFO DAGScheduler - Parents of final stage: List()
18:46:36.636 INFO DAGScheduler - Missing parents: List()
18:46:36.636 INFO DAGScheduler - Submitting ResultStage 199 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:36.642 INFO MemoryStore - Block broadcast_392 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
18:46:36.643 INFO MemoryStore - Block broadcast_392_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
18:46:36.643 INFO BlockManagerInfo - Added broadcast_392_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.4 MiB)
18:46:36.643 INFO SparkContext - Created broadcast 392 from broadcast at DAGScheduler.scala:1580
18:46:36.643 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 199 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:36.643 INFO TaskSchedulerImpl - Adding task set 199.0 with 1 tasks resource profile 0
18:46:36.644 INFO TaskSetManager - Starting task 0.0 in stage 199.0 (TID 255) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
18:46:36.644 INFO Executor - Running task 0.0 in stage 199.0 (TID 255)
18:46:36.655 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest24086021107251301756.bam:0+235514
18:46:36.659 INFO Executor - Finished task 0.0 in stage 199.0 (TID 255). 989 bytes result sent to driver
18:46:36.660 INFO TaskSetManager - Finished task 0.0 in stage 199.0 (TID 255) in 16 ms on localhost (executor driver) (1/1)
18:46:36.660 INFO TaskSchedulerImpl - Removed TaskSet 199.0, whose tasks have all completed, from pool
18:46:36.660 INFO DAGScheduler - ResultStage 199 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.024 s
18:46:36.660 INFO DAGScheduler - Job 148 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.660 INFO TaskSchedulerImpl - Killing all running tasks in stage 199: Stage finished
18:46:36.660 INFO DAGScheduler - Job 148 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.024313 s
18:46:36.669 INFO MemoryStore - Block broadcast_393 stored as values in memory (estimated size 298.0 KiB, free 1916.8 MiB)
18:46:36.675 INFO MemoryStore - Block broadcast_393_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
18:46:36.675 INFO BlockManagerInfo - Added broadcast_393_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:36.676 INFO SparkContext - Created broadcast 393 from newAPIHadoopFile at PathSplitSource.java:96
18:46:36.696 INFO MemoryStore - Block broadcast_394 stored as values in memory (estimated size 298.0 KiB, free 1916.4 MiB)
18:46:36.702 INFO MemoryStore - Block broadcast_394_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:36.702 INFO BlockManagerInfo - Added broadcast_394_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:36.702 INFO SparkContext - Created broadcast 394 from newAPIHadoopFile at PathSplitSource.java:96
18:46:36.722 INFO FileInputFormat - Total input files to process : 1
18:46:36.723 INFO MemoryStore - Block broadcast_395 stored as values in memory (estimated size 19.6 KiB, free 1916.4 MiB)
18:46:36.724 INFO MemoryStore - Block broadcast_395_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
18:46:36.724 INFO BlockManagerInfo - Added broadcast_395_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.3 MiB)
18:46:36.724 INFO SparkContext - Created broadcast 395 from broadcast at ReadsSparkSink.java:133
18:46:36.725 INFO MemoryStore - Block broadcast_396 stored as values in memory (estimated size 20.0 KiB, free 1916.3 MiB)
18:46:36.725 INFO MemoryStore - Block broadcast_396_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
18:46:36.725 INFO BlockManagerInfo - Added broadcast_396_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.3 MiB)
18:46:36.725 INFO SparkContext - Created broadcast 396 from broadcast at BamSink.java:76
18:46:36.727 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:36.727 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:36.727 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:36.743 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:36.744 INFO DAGScheduler - Registering RDD 955 (mapToPair at SparkUtils.java:161) as input to shuffle 40
18:46:36.744 INFO DAGScheduler - Got job 149 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:36.744 INFO DAGScheduler - Final stage: ResultStage 201 (runJob at SparkHadoopWriter.scala:83)
18:46:36.744 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 200)
18:46:36.744 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 200)
18:46:36.744 INFO DAGScheduler - Submitting ShuffleMapStage 200 (MapPartitionsRDD[955] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:36.761 INFO MemoryStore - Block broadcast_397 stored as values in memory (estimated size 434.3 KiB, free 1915.9 MiB)
18:46:36.768 INFO BlockManagerInfo - Removed broadcast_385_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:36.769 INFO MemoryStore - Block broadcast_397_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1915.9 MiB)
18:46:36.769 INFO BlockManagerInfo - Added broadcast_397_piece0 in memory on localhost:45727 (size: 157.6 KiB, free: 1919.2 MiB)
18:46:36.769 INFO BlockManagerInfo - Removed broadcast_388_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.2 MiB)
18:46:36.769 INFO SparkContext - Created broadcast 397 from broadcast at DAGScheduler.scala:1580
18:46:36.769 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 200 (MapPartitionsRDD[955] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:36.769 INFO TaskSchedulerImpl - Adding task set 200.0 with 1 tasks resource profile 0
18:46:36.770 INFO BlockManagerInfo - Removed broadcast_386_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.3 MiB)
18:46:36.770 INFO TaskSetManager - Starting task 0.0 in stage 200.0 (TID 256) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
18:46:36.770 INFO BlockManagerInfo - Removed broadcast_384_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:36.770 INFO Executor - Running task 0.0 in stage 200.0 (TID 256)
18:46:36.771 INFO BlockManagerInfo - Removed broadcast_389_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:36.771 INFO BlockManagerInfo - Removed broadcast_382_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.4 MiB)
18:46:36.772 INFO BlockManagerInfo - Removed broadcast_392_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.5 MiB)
18:46:36.772 INFO BlockManagerInfo - Removed broadcast_390_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.5 MiB)
18:46:36.773 INFO BlockManagerInfo - Removed broadcast_387_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.6 MiB)
18:46:36.773 INFO BlockManagerInfo - Removed broadcast_391_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:36.774 INFO BlockManagerInfo - Removed broadcast_394_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:36.801 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:36.813 INFO Executor - Finished task 0.0 in stage 200.0 (TID 256). 1148 bytes result sent to driver
18:46:36.814 INFO TaskSetManager - Finished task 0.0 in stage 200.0 (TID 256) in 44 ms on localhost (executor driver) (1/1)
18:46:36.814 INFO TaskSchedulerImpl - Removed TaskSet 200.0, whose tasks have all completed, from pool
18:46:36.814 INFO DAGScheduler - ShuffleMapStage 200 (mapToPair at SparkUtils.java:161) finished in 0.070 s
18:46:36.814 INFO DAGScheduler - looking for newly runnable stages
18:46:36.814 INFO DAGScheduler - running: HashSet()
18:46:36.814 INFO DAGScheduler - waiting: HashSet(ResultStage 201)
18:46:36.814 INFO DAGScheduler - failed: HashSet()
18:46:36.814 INFO DAGScheduler - Submitting ResultStage 201 (MapPartitionsRDD[960] at mapToPair at BamSink.java:91), which has no missing parents
18:46:36.820 INFO MemoryStore - Block broadcast_398 stored as values in memory (estimated size 155.3 KiB, free 1918.9 MiB)
18:46:36.821 INFO MemoryStore - Block broadcast_398_piece0 stored as bytes in memory (estimated size 58.5 KiB, free 1918.8 MiB)
18:46:36.821 INFO BlockManagerInfo - Added broadcast_398_piece0 in memory on localhost:45727 (size: 58.5 KiB, free: 1919.7 MiB)
18:46:36.821 INFO SparkContext - Created broadcast 398 from broadcast at DAGScheduler.scala:1580
18:46:36.821 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 201 (MapPartitionsRDD[960] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:36.821 INFO TaskSchedulerImpl - Adding task set 201.0 with 1 tasks resource profile 0
18:46:36.822 INFO TaskSetManager - Starting task 0.0 in stage 201.0 (TID 257) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:36.822 INFO Executor - Running task 0.0 in stage 201.0 (TID 257)
18:46:36.826 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:36.826 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:36.837 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:36.837 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:36.837 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:36.837 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:36.837 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:36.837 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:36.859 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846363249498061705342646_0960_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest3.someOtherPlace1569542195570973859/_temporary/0/task_202505191846363249498061705342646_0960_r_000000
18:46:36.859 INFO SparkHadoopMapRedUtil - attempt_202505191846363249498061705342646_0960_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:36.860 INFO Executor - Finished task 0.0 in stage 201.0 (TID 257). 1858 bytes result sent to driver
18:46:36.860 INFO TaskSetManager - Finished task 0.0 in stage 201.0 (TID 257) in 38 ms on localhost (executor driver) (1/1)
18:46:36.860 INFO TaskSchedulerImpl - Removed TaskSet 201.0, whose tasks have all completed, from pool
18:46:36.860 INFO DAGScheduler - ResultStage 201 (runJob at SparkHadoopWriter.scala:83) finished in 0.046 s
18:46:36.861 INFO DAGScheduler - Job 149 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.861 INFO TaskSchedulerImpl - Killing all running tasks in stage 201: Stage finished
18:46:36.861 INFO DAGScheduler - Job 149 finished: runJob at SparkHadoopWriter.scala:83, took 0.117413 s
18:46:36.861 INFO SparkHadoopWriter - Start to commit write Job job_202505191846363249498061705342646_0960.
18:46:36.866 INFO SparkHadoopWriter - Write Job job_202505191846363249498061705342646_0960 committed. Elapsed time: 4 ms.
18:46:36.876 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest317004859797815163255.bam
18:46:36.880 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest317004859797815163255.bam done
18:46:36.880 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest3.someOtherPlace1569542195570973859 to /tmp/ReadsSparkSinkUnitTest317004859797815163255.bam.sbi
18:46:36.885 INFO IndexFileMerger - Done merging .sbi files
18:46:36.885 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest3.someOtherPlace1569542195570973859 to /tmp/ReadsSparkSinkUnitTest317004859797815163255.bam.bai
18:46:36.889 INFO IndexFileMerger - Done merging .bai files
18:46:36.890 INFO MemoryStore - Block broadcast_399 stored as values in memory (estimated size 312.0 B, free 1918.8 MiB)
18:46:36.891 INFO MemoryStore - Block broadcast_399_piece0 stored as bytes in memory (estimated size 231.0 B, free 1918.8 MiB)
18:46:36.891 INFO BlockManagerInfo - Added broadcast_399_piece0 in memory on localhost:45727 (size: 231.0 B, free: 1919.7 MiB)
18:46:36.891 INFO SparkContext - Created broadcast 399 from broadcast at BamSource.java:104
18:46:36.892 INFO MemoryStore - Block broadcast_400 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
18:46:36.898 INFO MemoryStore - Block broadcast_400_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.5 MiB)
18:46:36.898 INFO BlockManagerInfo - Added broadcast_400_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:36.898 INFO SparkContext - Created broadcast 400 from newAPIHadoopFile at PathSplitSource.java:96
18:46:36.907 INFO FileInputFormat - Total input files to process : 1
18:46:36.921 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:36.921 INFO DAGScheduler - Got job 150 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:36.921 INFO DAGScheduler - Final stage: ResultStage 202 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:36.921 INFO DAGScheduler - Parents of final stage: List()
18:46:36.921 INFO DAGScheduler - Missing parents: List()
18:46:36.921 INFO DAGScheduler - Submitting ResultStage 202 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:36.927 INFO MemoryStore - Block broadcast_401 stored as values in memory (estimated size 148.2 KiB, free 1918.3 MiB)
18:46:36.928 INFO MemoryStore - Block broadcast_401_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.3 MiB)
18:46:36.928 INFO BlockManagerInfo - Added broadcast_401_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.6 MiB)
18:46:36.928 INFO SparkContext - Created broadcast 401 from broadcast at DAGScheduler.scala:1580
18:46:36.928 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 202 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:36.928 INFO TaskSchedulerImpl - Adding task set 202.0 with 1 tasks resource profile 0
18:46:36.929 INFO TaskSetManager - Starting task 0.0 in stage 202.0 (TID 258) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:36.929 INFO Executor - Running task 0.0 in stage 202.0 (TID 258)
18:46:36.941 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest317004859797815163255.bam:0+236517
18:46:36.945 INFO Executor - Finished task 0.0 in stage 202.0 (TID 258). 749470 bytes result sent to driver
18:46:36.946 INFO TaskSetManager - Finished task 0.0 in stage 202.0 (TID 258) in 17 ms on localhost (executor driver) (1/1)
18:46:36.946 INFO TaskSchedulerImpl - Removed TaskSet 202.0, whose tasks have all completed, from pool
18:46:36.947 INFO DAGScheduler - ResultStage 202 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
18:46:36.947 INFO DAGScheduler - Job 150 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:36.947 INFO TaskSchedulerImpl - Killing all running tasks in stage 202: Stage finished
18:46:36.947 INFO DAGScheduler - Job 150 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025891 s
18:46:36.958 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:36.958 INFO DAGScheduler - Got job 151 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:36.958 INFO DAGScheduler - Final stage: ResultStage 203 (count at ReadsSparkSinkUnitTest.java:185)
18:46:36.958 INFO DAGScheduler - Parents of final stage: List()
18:46:36.958 INFO DAGScheduler - Missing parents: List()
18:46:36.958 INFO DAGScheduler - Submitting ResultStage 203 (MapPartitionsRDD[948] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:36.975 INFO MemoryStore - Block broadcast_402 stored as values in memory (estimated size 426.1 KiB, free 1917.9 MiB)
18:46:36.976 INFO MemoryStore - Block broadcast_402_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.7 MiB)
18:46:36.976 INFO BlockManagerInfo - Added broadcast_402_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:36.976 INFO SparkContext - Created broadcast 402 from broadcast at DAGScheduler.scala:1580
18:46:36.976 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 203 (MapPartitionsRDD[948] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:36.976 INFO TaskSchedulerImpl - Adding task set 203.0 with 1 tasks resource profile 0
18:46:36.977 INFO TaskSetManager - Starting task 0.0 in stage 203.0 (TID 259) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
18:46:36.977 INFO Executor - Running task 0.0 in stage 203.0 (TID 259)
18:46:37.006 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:37.013 INFO Executor - Finished task 0.0 in stage 203.0 (TID 259). 989 bytes result sent to driver
18:46:37.013 INFO TaskSetManager - Finished task 0.0 in stage 203.0 (TID 259) in 36 ms on localhost (executor driver) (1/1)
18:46:37.013 INFO TaskSchedulerImpl - Removed TaskSet 203.0, whose tasks have all completed, from pool
18:46:37.013 INFO DAGScheduler - ResultStage 203 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.055 s
18:46:37.013 INFO DAGScheduler - Job 151 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.013 INFO TaskSchedulerImpl - Killing all running tasks in stage 203: Stage finished
18:46:37.013 INFO DAGScheduler - Job 151 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.055837 s
18:46:37.017 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:37.017 INFO DAGScheduler - Got job 152 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:37.017 INFO DAGScheduler - Final stage: ResultStage 204 (count at ReadsSparkSinkUnitTest.java:185)
18:46:37.017 INFO DAGScheduler - Parents of final stage: List()
18:46:37.017 INFO DAGScheduler - Missing parents: List()
18:46:37.017 INFO DAGScheduler - Submitting ResultStage 204 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:37.023 INFO MemoryStore - Block broadcast_403 stored as values in memory (estimated size 148.1 KiB, free 1917.6 MiB)
18:46:37.024 INFO MemoryStore - Block broadcast_403_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.5 MiB)
18:46:37.024 INFO BlockManagerInfo - Added broadcast_403_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.4 MiB)
18:46:37.024 INFO SparkContext - Created broadcast 403 from broadcast at DAGScheduler.scala:1580
18:46:37.024 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 204 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:37.024 INFO TaskSchedulerImpl - Adding task set 204.0 with 1 tasks resource profile 0
18:46:37.024 INFO TaskSetManager - Starting task 0.0 in stage 204.0 (TID 260) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:37.025 INFO Executor - Running task 0.0 in stage 204.0 (TID 260)
18:46:37.035 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest317004859797815163255.bam:0+236517
18:46:37.038 INFO Executor - Finished task 0.0 in stage 204.0 (TID 260). 989 bytes result sent to driver
18:46:37.038 INFO TaskSetManager - Finished task 0.0 in stage 204.0 (TID 260) in 14 ms on localhost (executor driver) (1/1)
18:46:37.038 INFO TaskSchedulerImpl - Removed TaskSet 204.0, whose tasks have all completed, from pool
18:46:37.038 INFO DAGScheduler - ResultStage 204 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.021 s
18:46:37.038 INFO DAGScheduler - Job 152 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.038 INFO TaskSchedulerImpl - Killing all running tasks in stage 204: Stage finished
18:46:37.038 INFO DAGScheduler - Job 152 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.021838 s
18:46:37.046 INFO MemoryStore - Block broadcast_404 stored as values in memory (estimated size 576.0 B, free 1917.5 MiB)
18:46:37.046 INFO MemoryStore - Block broadcast_404_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.5 MiB)
18:46:37.046 INFO BlockManagerInfo - Added broadcast_404_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.4 MiB)
18:46:37.047 INFO SparkContext - Created broadcast 404 from broadcast at CramSource.java:114
18:46:37.047 INFO MemoryStore - Block broadcast_405 stored as values in memory (estimated size 297.9 KiB, free 1917.2 MiB)
18:46:37.053 INFO MemoryStore - Block broadcast_405_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.2 MiB)
18:46:37.053 INFO BlockManagerInfo - Added broadcast_405_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:37.054 INFO SparkContext - Created broadcast 405 from newAPIHadoopFile at PathSplitSource.java:96
18:46:37.069 INFO MemoryStore - Block broadcast_406 stored as values in memory (estimated size 576.0 B, free 1917.2 MiB)
18:46:37.069 INFO MemoryStore - Block broadcast_406_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.2 MiB)
18:46:37.069 INFO BlockManagerInfo - Added broadcast_406_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.4 MiB)
18:46:37.070 INFO SparkContext - Created broadcast 406 from broadcast at CramSource.java:114
18:46:37.070 INFO MemoryStore - Block broadcast_407 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
18:46:37.076 INFO MemoryStore - Block broadcast_407_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
18:46:37.076 INFO BlockManagerInfo - Added broadcast_407_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:37.077 INFO SparkContext - Created broadcast 407 from newAPIHadoopFile at PathSplitSource.java:96
18:46:37.090 INFO FileInputFormat - Total input files to process : 1
18:46:37.092 INFO MemoryStore - Block broadcast_408 stored as values in memory (estimated size 6.0 KiB, free 1916.8 MiB)
18:46:37.092 INFO MemoryStore - Block broadcast_408_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1916.8 MiB)
18:46:37.092 INFO BlockManagerInfo - Added broadcast_408_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.3 MiB)
18:46:37.092 INFO SparkContext - Created broadcast 408 from broadcast at ReadsSparkSink.java:133
18:46:37.093 INFO MemoryStore - Block broadcast_409 stored as values in memory (estimated size 6.2 KiB, free 1916.8 MiB)
18:46:37.093 INFO MemoryStore - Block broadcast_409_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1916.8 MiB)
18:46:37.093 INFO BlockManagerInfo - Added broadcast_409_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.3 MiB)
18:46:37.094 INFO SparkContext - Created broadcast 409 from broadcast at CramSink.java:76
18:46:37.096 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:37.096 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:37.096 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:37.112 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:37.112 INFO DAGScheduler - Registering RDD 978 (mapToPair at SparkUtils.java:161) as input to shuffle 41
18:46:37.112 INFO DAGScheduler - Got job 153 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:37.112 INFO DAGScheduler - Final stage: ResultStage 206 (runJob at SparkHadoopWriter.scala:83)
18:46:37.112 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 205)
18:46:37.113 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 205)
18:46:37.113 INFO DAGScheduler - Submitting ShuffleMapStage 205 (MapPartitionsRDD[978] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:37.124 INFO MemoryStore - Block broadcast_410 stored as values in memory (estimated size 292.8 KiB, free 1916.5 MiB)
18:46:37.128 INFO BlockManagerInfo - Removed broadcast_400_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:37.129 INFO BlockManagerInfo - Removed broadcast_398_piece0 on localhost:45727 in memory (size: 58.5 KiB, free: 1919.4 MiB)
18:46:37.129 INFO MemoryStore - Block broadcast_410_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.0 MiB)
18:46:37.129 INFO BlockManagerInfo - Added broadcast_410_piece0 in memory on localhost:45727 (size: 107.3 KiB, free: 1919.3 MiB)
18:46:37.129 INFO SparkContext - Created broadcast 410 from broadcast at DAGScheduler.scala:1580
18:46:37.129 INFO BlockManagerInfo - Removed broadcast_402_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.5 MiB)
18:46:37.129 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 205 (MapPartitionsRDD[978] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:37.129 INFO TaskSchedulerImpl - Adding task set 205.0 with 1 tasks resource profile 0
18:46:37.130 INFO BlockManagerInfo - Removed broadcast_393_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:37.130 INFO BlockManagerInfo - Removed broadcast_396_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.5 MiB)
18:46:37.130 INFO TaskSetManager - Starting task 0.0 in stage 205.0 (TID 261) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
18:46:37.130 INFO Executor - Running task 0.0 in stage 205.0 (TID 261)
18:46:37.130 INFO BlockManagerInfo - Removed broadcast_403_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.6 MiB)
18:46:37.131 INFO BlockManagerInfo - Removed broadcast_401_piece0 on localhost:45727 in memory (size: 54.5 KiB, free: 1919.6 MiB)
18:46:37.132 INFO BlockManagerInfo - Removed broadcast_407_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:37.132 INFO BlockManagerInfo - Removed broadcast_399_piece0 on localhost:45727 in memory (size: 231.0 B, free: 1919.7 MiB)
18:46:37.133 INFO BlockManagerInfo - Removed broadcast_395_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.7 MiB)
18:46:37.133 INFO BlockManagerInfo - Removed broadcast_397_piece0 on localhost:45727 in memory (size: 157.6 KiB, free: 1919.8 MiB)
18:46:37.134 INFO BlockManagerInfo - Removed broadcast_406_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.8 MiB)
18:46:37.152 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:37.162 INFO Executor - Finished task 0.0 in stage 205.0 (TID 261). 1148 bytes result sent to driver
18:46:37.162 INFO TaskSetManager - Finished task 0.0 in stage 205.0 (TID 261) in 32 ms on localhost (executor driver) (1/1)
18:46:37.162 INFO TaskSchedulerImpl - Removed TaskSet 205.0, whose tasks have all completed, from pool
18:46:37.162 INFO DAGScheduler - ShuffleMapStage 205 (mapToPair at SparkUtils.java:161) finished in 0.049 s
18:46:37.162 INFO DAGScheduler - looking for newly runnable stages
18:46:37.162 INFO DAGScheduler - running: HashSet()
18:46:37.162 INFO DAGScheduler - waiting: HashSet(ResultStage 206)
18:46:37.162 INFO DAGScheduler - failed: HashSet()
18:46:37.163 INFO DAGScheduler - Submitting ResultStage 206 (MapPartitionsRDD[983] at mapToPair at CramSink.java:89), which has no missing parents
18:46:37.169 INFO MemoryStore - Block broadcast_411 stored as values in memory (estimated size 153.2 KiB, free 1919.1 MiB)
18:46:37.170 INFO MemoryStore - Block broadcast_411_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1919.0 MiB)
18:46:37.170 INFO BlockManagerInfo - Added broadcast_411_piece0 in memory on localhost:45727 (size: 58.1 KiB, free: 1919.8 MiB)
18:46:37.170 INFO SparkContext - Created broadcast 411 from broadcast at DAGScheduler.scala:1580
18:46:37.170 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 206 (MapPartitionsRDD[983] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
18:46:37.170 INFO TaskSchedulerImpl - Adding task set 206.0 with 1 tasks resource profile 0
18:46:37.171 INFO TaskSetManager - Starting task 0.0 in stage 206.0 (TID 262) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:37.171 INFO Executor - Running task 0.0 in stage 206.0 (TID 262)
18:46:37.175 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:37.175 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:37.181 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:37.181 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:37.181 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:37.181 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:37.181 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:37.181 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:37.224 INFO FileOutputCommitter - Saved output of task 'attempt_20250519184637318991254996099789_0983_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest5.someOtherPlace6014359296084345082/_temporary/0/task_20250519184637318991254996099789_0983_r_000000
18:46:37.224 INFO SparkHadoopMapRedUtil - attempt_20250519184637318991254996099789_0983_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:37.225 INFO Executor - Finished task 0.0 in stage 206.0 (TID 262). 1858 bytes result sent to driver
18:46:37.225 INFO TaskSetManager - Finished task 0.0 in stage 206.0 (TID 262) in 55 ms on localhost (executor driver) (1/1)
18:46:37.225 INFO TaskSchedulerImpl - Removed TaskSet 206.0, whose tasks have all completed, from pool
18:46:37.225 INFO DAGScheduler - ResultStage 206 (runJob at SparkHadoopWriter.scala:83) finished in 0.062 s
18:46:37.225 INFO DAGScheduler - Job 153 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.225 INFO TaskSchedulerImpl - Killing all running tasks in stage 206: Stage finished
18:46:37.226 INFO DAGScheduler - Job 153 finished: runJob at SparkHadoopWriter.scala:83, took 0.113545 s
18:46:37.226 INFO SparkHadoopWriter - Start to commit write Job job_20250519184637318991254996099789_0983.
18:46:37.231 INFO SparkHadoopWriter - Write Job job_20250519184637318991254996099789_0983 committed. Elapsed time: 5 ms.
18:46:37.244 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest51968667920873948298.cram
18:46:37.248 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest51968667920873948298.cram done
18:46:37.249 INFO MemoryStore - Block broadcast_412 stored as values in memory (estimated size 504.0 B, free 1919.0 MiB)
18:46:37.250 INFO MemoryStore - Block broadcast_412_piece0 stored as bytes in memory (estimated size 159.0 B, free 1919.0 MiB)
18:46:37.250 INFO BlockManagerInfo - Added broadcast_412_piece0 in memory on localhost:45727 (size: 159.0 B, free: 1919.8 MiB)
18:46:37.250 INFO SparkContext - Created broadcast 412 from broadcast at CramSource.java:114
18:46:37.251 INFO MemoryStore - Block broadcast_413 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
18:46:37.257 INFO MemoryStore - Block broadcast_413_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
18:46:37.258 INFO BlockManagerInfo - Added broadcast_413_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:37.258 INFO SparkContext - Created broadcast 413 from newAPIHadoopFile at PathSplitSource.java:96
18:46:37.272 INFO FileInputFormat - Total input files to process : 1
18:46:37.297 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:37.297 INFO DAGScheduler - Got job 154 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:37.297 INFO DAGScheduler - Final stage: ResultStage 207 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:37.297 INFO DAGScheduler - Parents of final stage: List()
18:46:37.297 INFO DAGScheduler - Missing parents: List()
18:46:37.297 INFO DAGScheduler - Submitting ResultStage 207 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:37.314 INFO MemoryStore - Block broadcast_414 stored as values in memory (estimated size 286.8 KiB, free 1918.4 MiB)
18:46:37.315 INFO MemoryStore - Block broadcast_414_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.3 MiB)
18:46:37.315 INFO BlockManagerInfo - Added broadcast_414_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.6 MiB)
18:46:37.315 INFO SparkContext - Created broadcast 414 from broadcast at DAGScheduler.scala:1580
18:46:37.316 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 207 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:37.316 INFO TaskSchedulerImpl - Adding task set 207.0 with 1 tasks resource profile 0
18:46:37.316 INFO TaskSetManager - Starting task 0.0 in stage 207.0 (TID 263) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:37.316 INFO Executor - Running task 0.0 in stage 207.0 (TID 263)
18:46:37.342 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest51968667920873948298.cram:0+43713
18:46:37.352 INFO Executor - Finished task 0.0 in stage 207.0 (TID 263). 154058 bytes result sent to driver
18:46:37.353 INFO TaskSetManager - Finished task 0.0 in stage 207.0 (TID 263) in 37 ms on localhost (executor driver) (1/1)
18:46:37.353 INFO TaskSchedulerImpl - Removed TaskSet 207.0, whose tasks have all completed, from pool
18:46:37.353 INFO DAGScheduler - ResultStage 207 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.055 s
18:46:37.353 INFO DAGScheduler - Job 154 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.353 INFO TaskSchedulerImpl - Killing all running tasks in stage 207: Stage finished
18:46:37.353 INFO DAGScheduler - Job 154 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.056428 s
18:46:37.358 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:37.358 INFO DAGScheduler - Got job 155 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:37.358 INFO DAGScheduler - Final stage: ResultStage 208 (count at ReadsSparkSinkUnitTest.java:185)
18:46:37.358 INFO DAGScheduler - Parents of final stage: List()
18:46:37.359 INFO DAGScheduler - Missing parents: List()
18:46:37.359 INFO DAGScheduler - Submitting ResultStage 208 (MapPartitionsRDD[972] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:37.370 INFO MemoryStore - Block broadcast_415 stored as values in memory (estimated size 286.8 KiB, free 1918.0 MiB)
18:46:37.371 INFO MemoryStore - Block broadcast_415_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.9 MiB)
18:46:37.371 INFO BlockManagerInfo - Added broadcast_415_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.5 MiB)
18:46:37.371 INFO SparkContext - Created broadcast 415 from broadcast at DAGScheduler.scala:1580
18:46:37.371 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 208 (MapPartitionsRDD[972] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:37.371 INFO TaskSchedulerImpl - Adding task set 208.0 with 1 tasks resource profile 0
18:46:37.372 INFO TaskSetManager - Starting task 0.0 in stage 208.0 (TID 264) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
18:46:37.372 INFO Executor - Running task 0.0 in stage 208.0 (TID 264)
18:46:37.392 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:37.398 INFO Executor - Finished task 0.0 in stage 208.0 (TID 264). 989 bytes result sent to driver
18:46:37.398 INFO TaskSetManager - Finished task 0.0 in stage 208.0 (TID 264) in 26 ms on localhost (executor driver) (1/1)
18:46:37.398 INFO TaskSchedulerImpl - Removed TaskSet 208.0, whose tasks have all completed, from pool
18:46:37.399 INFO DAGScheduler - ResultStage 208 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.040 s
18:46:37.399 INFO DAGScheduler - Job 155 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.399 INFO TaskSchedulerImpl - Killing all running tasks in stage 208: Stage finished
18:46:37.399 INFO DAGScheduler - Job 155 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.040727 s
18:46:37.403 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:37.404 INFO DAGScheduler - Got job 156 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:37.404 INFO DAGScheduler - Final stage: ResultStage 209 (count at ReadsSparkSinkUnitTest.java:185)
18:46:37.404 INFO DAGScheduler - Parents of final stage: List()
18:46:37.404 INFO DAGScheduler - Missing parents: List()
18:46:37.404 INFO DAGScheduler - Submitting ResultStage 209 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:37.415 INFO MemoryStore - Block broadcast_416 stored as values in memory (estimated size 286.8 KiB, free 1917.7 MiB)
18:46:37.416 INFO MemoryStore - Block broadcast_416_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.6 MiB)
18:46:37.416 INFO BlockManagerInfo - Added broadcast_416_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.4 MiB)
18:46:37.416 INFO SparkContext - Created broadcast 416 from broadcast at DAGScheduler.scala:1580
18:46:37.417 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 209 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:37.417 INFO TaskSchedulerImpl - Adding task set 209.0 with 1 tasks resource profile 0
18:46:37.417 INFO TaskSetManager - Starting task 0.0 in stage 209.0 (TID 265) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:37.417 INFO Executor - Running task 0.0 in stage 209.0 (TID 265)
18:46:37.437 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest51968667920873948298.cram:0+43713
18:46:37.446 INFO Executor - Finished task 0.0 in stage 209.0 (TID 265). 989 bytes result sent to driver
18:46:37.446 INFO TaskSetManager - Finished task 0.0 in stage 209.0 (TID 265) in 29 ms on localhost (executor driver) (1/1)
18:46:37.446 INFO TaskSchedulerImpl - Removed TaskSet 209.0, whose tasks have all completed, from pool
18:46:37.446 INFO DAGScheduler - ResultStage 209 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.042 s
18:46:37.447 INFO DAGScheduler - Job 156 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.447 INFO TaskSchedulerImpl - Killing all running tasks in stage 209: Stage finished
18:46:37.447 INFO DAGScheduler - Job 156 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.043291 s
18:46:37.455 INFO MemoryStore - Block broadcast_417 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
18:46:37.461 INFO MemoryStore - Block broadcast_417_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.2 MiB)
18:46:37.462 INFO BlockManagerInfo - Added broadcast_417_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:37.462 INFO SparkContext - Created broadcast 417 from newAPIHadoopFile at PathSplitSource.java:96
18:46:37.487 INFO MemoryStore - Block broadcast_418 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
18:46:37.493 INFO MemoryStore - Block broadcast_418_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
18:46:37.493 INFO BlockManagerInfo - Added broadcast_418_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:37.494 INFO SparkContext - Created broadcast 418 from newAPIHadoopFile at PathSplitSource.java:96
18:46:37.513 INFO FileInputFormat - Total input files to process : 1
18:46:37.515 INFO MemoryStore - Block broadcast_419 stored as values in memory (estimated size 160.7 KiB, free 1916.7 MiB)
18:46:37.515 INFO MemoryStore - Block broadcast_419_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.7 MiB)
18:46:37.516 INFO BlockManagerInfo - Added broadcast_419_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:37.516 INFO SparkContext - Created broadcast 419 from broadcast at ReadsSparkSink.java:133
18:46:37.519 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
18:46:37.519 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:37.519 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:37.536 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:37.536 INFO DAGScheduler - Registering RDD 1003 (mapToPair at SparkUtils.java:161) as input to shuffle 42
18:46:37.536 INFO DAGScheduler - Got job 157 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:37.536 INFO DAGScheduler - Final stage: ResultStage 211 (runJob at SparkHadoopWriter.scala:83)
18:46:37.536 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 210)
18:46:37.536 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 210)
18:46:37.536 INFO DAGScheduler - Submitting ShuffleMapStage 210 (MapPartitionsRDD[1003] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:37.553 INFO MemoryStore - Block broadcast_420 stored as values in memory (estimated size 520.4 KiB, free 1916.2 MiB)
18:46:37.555 INFO MemoryStore - Block broadcast_420_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.0 MiB)
18:46:37.555 INFO BlockManagerInfo - Added broadcast_420_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.2 MiB)
18:46:37.555 INFO SparkContext - Created broadcast 420 from broadcast at DAGScheduler.scala:1580
18:46:37.555 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 210 (MapPartitionsRDD[1003] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:37.555 INFO TaskSchedulerImpl - Adding task set 210.0 with 1 tasks resource profile 0
18:46:37.556 INFO TaskSetManager - Starting task 0.0 in stage 210.0 (TID 266) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:37.556 INFO Executor - Running task 0.0 in stage 210.0 (TID 266)
18:46:37.586 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:37.601 INFO Executor - Finished task 0.0 in stage 210.0 (TID 266). 1148 bytes result sent to driver
18:46:37.601 INFO TaskSetManager - Finished task 0.0 in stage 210.0 (TID 266) in 45 ms on localhost (executor driver) (1/1)
18:46:37.601 INFO TaskSchedulerImpl - Removed TaskSet 210.0, whose tasks have all completed, from pool
18:46:37.601 INFO DAGScheduler - ShuffleMapStage 210 (mapToPair at SparkUtils.java:161) finished in 0.064 s
18:46:37.601 INFO DAGScheduler - looking for newly runnable stages
18:46:37.601 INFO DAGScheduler - running: HashSet()
18:46:37.601 INFO DAGScheduler - waiting: HashSet(ResultStage 211)
18:46:37.601 INFO DAGScheduler - failed: HashSet()
18:46:37.602 INFO DAGScheduler - Submitting ResultStage 211 (MapPartitionsRDD[1009] at saveAsTextFile at SamSink.java:65), which has no missing parents
18:46:37.608 INFO MemoryStore - Block broadcast_421 stored as values in memory (estimated size 241.1 KiB, free 1915.8 MiB)
18:46:37.609 INFO MemoryStore - Block broadcast_421_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1915.7 MiB)
18:46:37.609 INFO BlockManagerInfo - Added broadcast_421_piece0 in memory on localhost:45727 (size: 66.9 KiB, free: 1919.1 MiB)
18:46:37.609 INFO SparkContext - Created broadcast 421 from broadcast at DAGScheduler.scala:1580
18:46:37.609 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 211 (MapPartitionsRDD[1009] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
18:46:37.609 INFO TaskSchedulerImpl - Adding task set 211.0 with 1 tasks resource profile 0
18:46:37.610 INFO TaskSetManager - Starting task 0.0 in stage 211.0 (TID 267) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:37.610 INFO Executor - Running task 0.0 in stage 211.0 (TID 267)
18:46:37.614 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:37.614 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:37.624 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
18:46:37.624 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:37.624 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:37.640 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846374439190105273513355_1009_m_000000_0' to file:/tmp/ReadsSparkSinkUnitTest6.someOtherPlace18250099430899577813/_temporary/0/task_202505191846374439190105273513355_1009_m_000000
18:46:37.641 INFO SparkHadoopMapRedUtil - attempt_202505191846374439190105273513355_1009_m_000000_0: Committed. Elapsed time: 0 ms.
18:46:37.641 INFO Executor - Finished task 0.0 in stage 211.0 (TID 267). 1858 bytes result sent to driver
18:46:37.641 INFO TaskSetManager - Finished task 0.0 in stage 211.0 (TID 267) in 31 ms on localhost (executor driver) (1/1)
18:46:37.641 INFO TaskSchedulerImpl - Removed TaskSet 211.0, whose tasks have all completed, from pool
18:46:37.641 INFO DAGScheduler - ResultStage 211 (runJob at SparkHadoopWriter.scala:83) finished in 0.039 s
18:46:37.642 INFO DAGScheduler - Job 157 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.642 INFO TaskSchedulerImpl - Killing all running tasks in stage 211: Stage finished
18:46:37.642 INFO DAGScheduler - Job 157 finished: runJob at SparkHadoopWriter.scala:83, took 0.106096 s
18:46:37.642 INFO SparkHadoopWriter - Start to commit write Job job_202505191846374439190105273513355_1009.
18:46:37.646 INFO SparkHadoopWriter - Write Job job_202505191846374439190105273513355_1009 committed. Elapsed time: 4 ms.
18:46:37.654 INFO HadoopFileSystemWrapper - Concatenating 2 parts to /tmp/ReadsSparkSinkUnitTest611881674894628395885.sam
18:46:37.659 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest611881674894628395885.sam done
WARNING 2025-05-19 18:46:37 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-05-19 18:46:37 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:37.664 INFO BlockManagerInfo - Removed broadcast_405_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:37.665 INFO BlockManagerInfo - Removed broadcast_409_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.1 MiB)
18:46:37.665 INFO MemoryStore - Block broadcast_422 stored as values in memory (estimated size 160.7 KiB, free 1915.9 MiB)
18:46:37.665 INFO BlockManagerInfo - Removed broadcast_408_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.2 MiB)
18:46:37.666 INFO MemoryStore - Block broadcast_422_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.9 MiB)
18:46:37.666 INFO BlockManagerInfo - Added broadcast_422_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.1 MiB)
18:46:37.666 INFO BlockManagerInfo - Removed broadcast_412_piece0 on localhost:45727 in memory (size: 159.0 B, free: 1919.1 MiB)
18:46:37.666 INFO SparkContext - Created broadcast 422 from broadcast at SamSource.java:78
18:46:37.666 INFO BlockManagerInfo - Removed broadcast_413_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:37.667 INFO BlockManagerInfo - Removed broadcast_416_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.3 MiB)
18:46:37.667 INFO BlockManagerInfo - Removed broadcast_410_piece0 on localhost:45727 in memory (size: 107.3 KiB, free: 1919.4 MiB)
18:46:37.667 INFO MemoryStore - Block broadcast_423 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
18:46:37.667 INFO BlockManagerInfo - Removed broadcast_414_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.5 MiB)
18:46:37.668 INFO BlockManagerInfo - Removed broadcast_420_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.7 MiB)
18:46:37.668 INFO BlockManagerInfo - Removed broadcast_419_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.7 MiB)
18:46:37.669 INFO BlockManagerInfo - Removed broadcast_415_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.8 MiB)
18:46:37.669 INFO BlockManagerInfo - Removed broadcast_418_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:37.670 INFO BlockManagerInfo - Removed broadcast_421_piece0 on localhost:45727 in memory (size: 66.9 KiB, free: 1919.9 MiB)
18:46:37.670 INFO BlockManagerInfo - Removed broadcast_411_piece0 on localhost:45727 in memory (size: 58.1 KiB, free: 1919.9 MiB)
18:46:37.671 INFO BlockManagerInfo - Removed broadcast_404_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.9 MiB)
18:46:37.675 INFO MemoryStore - Block broadcast_423_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.2 MiB)
18:46:37.675 INFO BlockManagerInfo - Added broadcast_423_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.9 MiB)
18:46:37.675 INFO SparkContext - Created broadcast 423 from newAPIHadoopFile at SamSource.java:108
18:46:37.677 INFO FileInputFormat - Total input files to process : 1
18:46:37.681 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:37.681 INFO DAGScheduler - Got job 158 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:37.681 INFO DAGScheduler - Final stage: ResultStage 212 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:37.681 INFO DAGScheduler - Parents of final stage: List()
18:46:37.681 INFO DAGScheduler - Missing parents: List()
18:46:37.681 INFO DAGScheduler - Submitting ResultStage 212 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:37.681 INFO MemoryStore - Block broadcast_424 stored as values in memory (estimated size 7.5 KiB, free 1919.1 MiB)
18:46:37.682 INFO MemoryStore - Block broadcast_424_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1919.1 MiB)
18:46:37.682 INFO BlockManagerInfo - Added broadcast_424_piece0 in memory on localhost:45727 (size: 3.8 KiB, free: 1919.9 MiB)
18:46:37.682 INFO SparkContext - Created broadcast 424 from broadcast at DAGScheduler.scala:1580
18:46:37.682 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 212 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:37.682 INFO TaskSchedulerImpl - Adding task set 212.0 with 1 tasks resource profile 0
18:46:37.683 INFO TaskSetManager - Starting task 0.0 in stage 212.0 (TID 268) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:37.683 INFO Executor - Running task 0.0 in stage 212.0 (TID 268)
18:46:37.684 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest611881674894628395885.sam:0+847558
18:46:37.703 INFO Executor - Finished task 0.0 in stage 212.0 (TID 268). 651483 bytes result sent to driver
18:46:37.705 INFO TaskSetManager - Finished task 0.0 in stage 212.0 (TID 268) in 23 ms on localhost (executor driver) (1/1)
18:46:37.705 INFO TaskSchedulerImpl - Removed TaskSet 212.0, whose tasks have all completed, from pool
18:46:37.705 INFO DAGScheduler - ResultStage 212 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.024 s
18:46:37.705 INFO DAGScheduler - Job 158 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.705 INFO TaskSchedulerImpl - Killing all running tasks in stage 212: Stage finished
18:46:37.705 INFO DAGScheduler - Job 158 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.024548 s
18:46:37.715 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:37.715 INFO DAGScheduler - Got job 159 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:37.715 INFO DAGScheduler - Final stage: ResultStage 213 (count at ReadsSparkSinkUnitTest.java:185)
18:46:37.715 INFO DAGScheduler - Parents of final stage: List()
18:46:37.715 INFO DAGScheduler - Missing parents: List()
18:46:37.715 INFO DAGScheduler - Submitting ResultStage 213 (MapPartitionsRDD[996] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:37.738 INFO MemoryStore - Block broadcast_425 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
18:46:37.739 INFO MemoryStore - Block broadcast_425_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
18:46:37.739 INFO BlockManagerInfo - Added broadcast_425_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.7 MiB)
18:46:37.739 INFO SparkContext - Created broadcast 425 from broadcast at DAGScheduler.scala:1580
18:46:37.739 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 213 (MapPartitionsRDD[996] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:37.740 INFO TaskSchedulerImpl - Adding task set 213.0 with 1 tasks resource profile 0
18:46:37.740 INFO TaskSetManager - Starting task 0.0 in stage 213.0 (TID 269) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:37.740 INFO Executor - Running task 0.0 in stage 213.0 (TID 269)
18:46:37.770 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:37.779 INFO Executor - Finished task 0.0 in stage 213.0 (TID 269). 989 bytes result sent to driver
18:46:37.779 INFO TaskSetManager - Finished task 0.0 in stage 213.0 (TID 269) in 39 ms on localhost (executor driver) (1/1)
18:46:37.780 INFO TaskSchedulerImpl - Removed TaskSet 213.0, whose tasks have all completed, from pool
18:46:37.780 INFO DAGScheduler - ResultStage 213 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
18:46:37.780 INFO DAGScheduler - Job 159 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.780 INFO TaskSchedulerImpl - Killing all running tasks in stage 213: Stage finished
18:46:37.780 INFO DAGScheduler - Job 159 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.065195 s
18:46:37.784 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:37.784 INFO DAGScheduler - Got job 160 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:37.784 INFO DAGScheduler - Final stage: ResultStage 214 (count at ReadsSparkSinkUnitTest.java:185)
18:46:37.784 INFO DAGScheduler - Parents of final stage: List()
18:46:37.784 INFO DAGScheduler - Missing parents: List()
18:46:37.785 INFO DAGScheduler - Submitting ResultStage 214 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:37.785 INFO MemoryStore - Block broadcast_426 stored as values in memory (estimated size 7.4 KiB, free 1918.6 MiB)
18:46:37.786 INFO MemoryStore - Block broadcast_426_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.6 MiB)
18:46:37.786 INFO BlockManagerInfo - Added broadcast_426_piece0 in memory on localhost:45727 (size: 3.8 KiB, free: 1919.7 MiB)
18:46:37.786 INFO SparkContext - Created broadcast 426 from broadcast at DAGScheduler.scala:1580
18:46:37.786 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 214 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:37.786 INFO TaskSchedulerImpl - Adding task set 214.0 with 1 tasks resource profile 0
18:46:37.786 INFO TaskSetManager - Starting task 0.0 in stage 214.0 (TID 270) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:37.787 INFO Executor - Running task 0.0 in stage 214.0 (TID 270)
18:46:37.788 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest611881674894628395885.sam:0+847558
18:46:37.795 INFO Executor - Finished task 0.0 in stage 214.0 (TID 270). 946 bytes result sent to driver
18:46:37.795 INFO TaskSetManager - Finished task 0.0 in stage 214.0 (TID 270) in 9 ms on localhost (executor driver) (1/1)
18:46:37.795 INFO TaskSchedulerImpl - Removed TaskSet 214.0, whose tasks have all completed, from pool
18:46:37.795 INFO DAGScheduler - ResultStage 214 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.010 s
18:46:37.795 INFO DAGScheduler - Job 160 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:37.795 INFO TaskSchedulerImpl - Killing all running tasks in stage 214: Stage finished
18:46:37.795 INFO DAGScheduler - Job 160 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.010906 s
18:46:37.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:37.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:37.814 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:37.814 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:37.818 INFO MemoryStore - Block broadcast_427 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
18:46:37.828 INFO MemoryStore - Block broadcast_427_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
18:46:37.828 INFO BlockManagerInfo - Added broadcast_427_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:37.828 INFO SparkContext - Created broadcast 427 from newAPIHadoopFile at PathSplitSource.java:96
18:46:37.850 INFO MemoryStore - Block broadcast_428 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
18:46:37.856 INFO MemoryStore - Block broadcast_428_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
18:46:37.856 INFO BlockManagerInfo - Added broadcast_428_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:37.857 INFO SparkContext - Created broadcast 428 from newAPIHadoopFile at PathSplitSource.java:96
18:46:37.877 INFO FileInputFormat - Total input files to process : 1
18:46:37.879 INFO MemoryStore - Block broadcast_429 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
18:46:37.879 INFO MemoryStore - Block broadcast_429_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
18:46:37.879 INFO BlockManagerInfo - Added broadcast_429_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:37.880 INFO SparkContext - Created broadcast 429 from broadcast at ReadsSparkSink.java:133
18:46:37.881 INFO MemoryStore - Block broadcast_430 stored as values in memory (estimated size 163.2 KiB, free 1917.6 MiB)
18:46:37.881 INFO MemoryStore - Block broadcast_430_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.6 MiB)
18:46:37.881 INFO BlockManagerInfo - Added broadcast_430_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:37.882 INFO SparkContext - Created broadcast 430 from broadcast at BamSink.java:76
18:46:37.883 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts dst=null perm=null proto=rpc
18:46:37.884 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:37.884 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:37.884 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:37.885 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:37.891 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:37.891 INFO DAGScheduler - Registering RDD 1028 (mapToPair at SparkUtils.java:161) as input to shuffle 43
18:46:37.891 INFO DAGScheduler - Got job 161 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:37.891 INFO DAGScheduler - Final stage: ResultStage 216 (runJob at SparkHadoopWriter.scala:83)
18:46:37.891 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 215)
18:46:37.891 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 215)
18:46:37.891 INFO DAGScheduler - Submitting ShuffleMapStage 215 (MapPartitionsRDD[1028] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:37.917 INFO MemoryStore - Block broadcast_431 stored as values in memory (estimated size 520.4 KiB, free 1917.0 MiB)
18:46:37.918 INFO MemoryStore - Block broadcast_431_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.9 MiB)
18:46:37.918 INFO BlockManagerInfo - Added broadcast_431_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.5 MiB)
18:46:37.919 INFO SparkContext - Created broadcast 431 from broadcast at DAGScheduler.scala:1580
18:46:37.919 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 215 (MapPartitionsRDD[1028] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:37.919 INFO TaskSchedulerImpl - Adding task set 215.0 with 1 tasks resource profile 0
18:46:37.919 INFO TaskSetManager - Starting task 0.0 in stage 215.0 (TID 271) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:37.919 INFO Executor - Running task 0.0 in stage 215.0 (TID 271)
18:46:37.950 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:37.965 INFO Executor - Finished task 0.0 in stage 215.0 (TID 271). 1148 bytes result sent to driver
18:46:37.965 INFO TaskSetManager - Finished task 0.0 in stage 215.0 (TID 271) in 46 ms on localhost (executor driver) (1/1)
18:46:37.965 INFO TaskSchedulerImpl - Removed TaskSet 215.0, whose tasks have all completed, from pool
18:46:37.966 INFO DAGScheduler - ShuffleMapStage 215 (mapToPair at SparkUtils.java:161) finished in 0.074 s
18:46:37.966 INFO DAGScheduler - looking for newly runnable stages
18:46:37.966 INFO DAGScheduler - running: HashSet()
18:46:37.966 INFO DAGScheduler - waiting: HashSet(ResultStage 216)
18:46:37.966 INFO DAGScheduler - failed: HashSet()
18:46:37.966 INFO DAGScheduler - Submitting ResultStage 216 (MapPartitionsRDD[1033] at mapToPair at BamSink.java:91), which has no missing parents
18:46:37.972 INFO MemoryStore - Block broadcast_432 stored as values in memory (estimated size 241.5 KiB, free 1916.6 MiB)
18:46:37.973 INFO MemoryStore - Block broadcast_432_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.6 MiB)
18:46:37.973 INFO BlockManagerInfo - Added broadcast_432_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.4 MiB)
18:46:37.973 INFO SparkContext - Created broadcast 432 from broadcast at DAGScheduler.scala:1580
18:46:37.974 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 216 (MapPartitionsRDD[1033] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:37.974 INFO TaskSchedulerImpl - Adding task set 216.0 with 1 tasks resource profile 0
18:46:37.974 INFO TaskSetManager - Starting task 0.0 in stage 216.0 (TID 272) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:37.974 INFO Executor - Running task 0.0 in stage 216.0 (TID 272)
18:46:37.978 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:37.979 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:37.990 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:37.990 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:37.990 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:37.990 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:37.990 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:37.990 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:37.991 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:37.992 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:37.993 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:37.995 INFO StateChange - BLOCK* allocate blk_1073741871_1047, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/part-r-00000
18:46:37.996 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741871_1047 src: /127.0.0.1:47232 dest: /127.0.0.1:38019
18:46:37.998 INFO clienttrace - src: /127.0.0.1:47232, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741871_1047, duration(ns): 1614935
18:46:37.998 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741871_1047, type=LAST_IN_PIPELINE terminating
18:46:37.999 INFO FSNamesystem - BLOCK* blk_1073741871_1047 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/part-r-00000
18:46:38.400 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:38.400 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:38.401 INFO StateChange - BLOCK* allocate blk_1073741872_1048, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/.part-r-00000.sbi
18:46:38.402 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741872_1048 src: /127.0.0.1:47240 dest: /127.0.0.1:38019
18:46:38.403 INFO clienttrace - src: /127.0.0.1:47240, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741872_1048, duration(ns): 451665
18:46:38.404 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741872_1048, type=LAST_IN_PIPELINE terminating
18:46:38.404 INFO FSNamesystem - BLOCK* blk_1073741872_1048 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/.part-r-00000.sbi
18:46:38.805 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:38.807 INFO StateChange - BLOCK* allocate blk_1073741873_1049, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/.part-r-00000.bai
18:46:38.808 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741873_1049 src: /127.0.0.1:47248 dest: /127.0.0.1:38019
18:46:38.809 INFO clienttrace - src: /127.0.0.1:47248, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741873_1049, duration(ns): 533115
18:46:38.809 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741873_1049, type=LAST_IN_PIPELINE terminating
18:46:38.810 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:38.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0 dst=null perm=null proto=rpc
18:46:38.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0 dst=null perm=null proto=rpc
18:46:38.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/task_202505191846375493223126718341372_1033_r_000000 dst=null perm=null proto=rpc
18:46:38.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/_temporary/attempt_202505191846375493223126718341372_1033_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/task_202505191846375493223126718341372_1033_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:38.813 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846375493223126718341372_1033_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/task_202505191846375493223126718341372_1033_r_000000
18:46:38.813 INFO SparkHadoopMapRedUtil - attempt_202505191846375493223126718341372_1033_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:38.814 INFO Executor - Finished task 0.0 in stage 216.0 (TID 272). 1858 bytes result sent to driver
18:46:38.814 INFO TaskSetManager - Finished task 0.0 in stage 216.0 (TID 272) in 840 ms on localhost (executor driver) (1/1)
18:46:38.814 INFO TaskSchedulerImpl - Removed TaskSet 216.0, whose tasks have all completed, from pool
18:46:38.814 INFO DAGScheduler - ResultStage 216 (runJob at SparkHadoopWriter.scala:83) finished in 0.848 s
18:46:38.814 INFO DAGScheduler - Job 161 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:38.814 INFO TaskSchedulerImpl - Killing all running tasks in stage 216: Stage finished
18:46:38.814 INFO DAGScheduler - Job 161 finished: runJob at SparkHadoopWriter.scala:83, took 0.923809 s
18:46:38.815 INFO SparkHadoopWriter - Start to commit write Job job_202505191846375493223126718341372_1033.
18:46:38.815 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:38.816 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts dst=null perm=null proto=rpc
18:46:38.816 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/task_202505191846375493223126718341372_1033_r_000000 dst=null perm=null proto=rpc
18:46:38.816 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:38.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/task_202505191846375493223126718341372_1033_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:38.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:38.818 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/task_202505191846375493223126718341372_1033_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:38.818 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:38.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary/0/task_202505191846375493223126718341372_1033_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:38.820 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:38.820 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:38.821 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:38.822 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.spark-staging-1033 dst=null perm=null proto=rpc
18:46:38.822 INFO SparkHadoopWriter - Write Job job_202505191846375493223126718341372_1033 committed. Elapsed time: 7 ms.
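[editor's note] The audit trail above (attempt directory renamed to the task directory, committed files renamed into the .parts root, _temporary deleted, _SUCCESS created) is the FileOutputCommitter algorithm version 1 protocol reported earlier in this log. Below is a minimal sketch of those promotion steps using only the plain Hadoop FileSystem API; the class name and paths are hypothetical placeholders, not the test's actual code.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/** Sketch of the FileOutputCommitter v1 promotion steps seen in the audit log. */
public class CommitProtocolSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path parts = new Path("hdfs://localhost:8020/user/runner/example.bam.parts"); // hypothetical
        FileSystem fs = parts.getFileSystem(conf);

        Path attemptDir = new Path(parts, "_temporary/0/_temporary/attempt_0_r_000000_0");
        Path taskDir    = new Path(parts, "_temporary/0/task_0_r_000000");

        // commitTask: promote the attempt directory to the task directory.
        fs.rename(attemptDir, taskDir);

        // commitJob: move each committed task file into the output root...
        for (FileStatus st : fs.listStatus(taskDir)) {
            fs.rename(st.getPath(), new Path(parts, st.getPath().getName()));
        }
        // ...then drop the _temporary tree and mark success.
        fs.delete(new Path(parts, "_temporary"), true);
        fs.create(new Path(parts, "_SUCCESS")).close();
    }
}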
18:46:38.823 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:38.824 INFO StateChange - BLOCK* allocate blk_1073741874_1050, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/header
18:46:38.825 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741874_1050 src: /127.0.0.1:47250 dest: /127.0.0.1:38019
18:46:38.827 INFO clienttrace - src: /127.0.0.1:47250, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741874_1050, duration(ns): 513263
18:46:38.827 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741874_1050, type=LAST_IN_PIPELINE terminating
18:46:38.827 INFO FSNamesystem - BLOCK* blk_1073741874_1050 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/header
18:46:39.228 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:39.229 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:39.230 INFO StateChange - BLOCK* allocate blk_1073741875_1051, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/terminator
18:46:39.231 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741875_1051 src: /127.0.0.1:47262 dest: /127.0.0.1:38019
18:46:39.232 INFO clienttrace - src: /127.0.0.1:47262, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741875_1051, duration(ns): 381005
18:46:39.232 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741875_1051, type=LAST_IN_PIPELINE terminating
18:46:39.233 INFO FSNamesystem - BLOCK* blk_1073741875_1051 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/terminator
18:46:39.633 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:39.634 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts dst=null perm=null proto=rpc
18:46:39.635 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:39.636 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:39.636 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam
18:46:39.637 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:39.637 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:39.637 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:39.638 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:39.638 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam done
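[editor's note] The "Concatenating 3 parts" step corresponds to the HDFS concat RPC in the audit line: header, part-r-00000 and terminator are appended into a freshly created output file, which is then renamed over the final .bam. A rough sketch of that sequence with FileSystem.concat follows; paths are hypothetical, and since HDFS concat imposes block-size restrictions, real code may fall back to a byte-level copy.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/** Sketch of stitching header + shard + terminator into one BAM via HDFS concat. */
public class ConcatSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path parts = new Path("hdfs://localhost:8020/user/runner/example.bam.parts"); // hypothetical
        Path finalBam = new Path("hdfs://localhost:8020/user/runner/example.bam");    // hypothetical
        FileSystem fs = parts.getFileSystem(conf);

        // The log shows an empty "output" file being created first and used as the concat target.
        Path target = new Path(parts, "output");
        fs.create(target).close();
        fs.concat(target, new Path[] {
                new Path(parts, "header"),
                new Path(parts, "part-r-00000"),
                new Path(parts, "terminator")
        });

        // Replace any stale final file and move the concatenated result into place.
        fs.delete(finalBam, false);
        fs.rename(target, finalBam);
    }
}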
18:46:39.638 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:39.639 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi
18:46:39.639 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts dst=null perm=null proto=rpc
18:46:39.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:39.641 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:39.641 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:39.643 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:39.643 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:39.644 INFO StateChange - BLOCK* allocate blk_1073741876_1052, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi
18:46:39.645 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741876_1052 src: /127.0.0.1:47288 dest: /127.0.0.1:38019
18:46:39.646 INFO clienttrace - src: /127.0.0.1:47288, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741876_1052, duration(ns): 515182
18:46:39.646 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741876_1052, type=LAST_IN_PIPELINE terminating
18:46:39.647 INFO FSNamesystem - BLOCK* blk_1073741876_1052 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi
18:46:40.048 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:40.048 INFO IndexFileMerger - Done merging .sbi files
18:46:40.048 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai
18:46:40.049 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts dst=null perm=null proto=rpc
18:46:40.050 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:40.051 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:40.051 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:40.053 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:40.053 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:40.054 INFO StateChange - BLOCK* allocate blk_1073741877_1053, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai
18:46:40.055 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741877_1053 src: /127.0.0.1:47304 dest: /127.0.0.1:38019
18:46:40.056 INFO clienttrace - src: /127.0.0.1:47304, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741877_1053, duration(ns): 495696
18:46:40.057 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741877_1053, type=LAST_IN_PIPELINE terminating
18:46:40.058 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:40.058 INFO IndexFileMerger - Done merging .bai files
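[editor's note] The IndexFileMerger steps read each per-part .sbi/.bai file from the .parts directory, write a single merged index next to the final BAM, and delete the part indexes. The real merger has format-specific logic for rewriting offsets; the sketch below only illustrates the scan-and-copy loop over the part files and is not the actual merging algorithm.

import java.io.InputStream;
import java.io.OutputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

/** Loose sketch: stream every ".sbi" part into one destination file, then delete the parts. */
public class IndexMergeSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path partsDir = new Path("hdfs://localhost:8020/user/runner/example.bam.parts"); // hypothetical
        Path merged   = new Path("hdfs://localhost:8020/user/runner/example.bam.sbi");   // hypothetical
        FileSystem fs = partsDir.getFileSystem(conf);

        try (OutputStream out = fs.create(merged)) {
            for (FileStatus st : fs.listStatus(partsDir)) {
                if (st.getPath().getName().endsWith(".sbi")) {
                    // A real index merger rewrites headers/virtual offsets; this just copies bytes.
                    try (InputStream in = fs.open(st.getPath())) {
                        IOUtils.copyBytes(in, out, conf, false);
                    }
                    fs.delete(st.getPath(), false); // the audit log shows the part index being deleted
                }
            }
        }
    }
}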
18:46:40.058 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.parts dst=null perm=null proto=rpc
18:46:40.067 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.075 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi dst=null perm=null proto=rpc
18:46:40.075 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi dst=null perm=null proto=rpc
18:46:40.076 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi dst=null perm=null proto=rpc
18:46:40.077 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:40.077 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.078 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.078 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.078 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.079 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.079 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.080 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.081 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:40.083 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:40.083 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:40.083 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:40.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi dst=null perm=null proto=rpc
18:46:40.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi dst=null perm=null proto=rpc
18:46:40.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.sbi dst=null perm=null proto=rpc
18:46:40.085 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:40.086 INFO MemoryStore - Block broadcast_433 stored as values in memory (estimated size 320.0 B, free 1916.6 MiB)
18:46:40.086 INFO MemoryStore - Block broadcast_433_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.6 MiB)
18:46:40.086 INFO BlockManagerInfo - Added broadcast_433_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.4 MiB)
18:46:40.086 INFO SparkContext - Created broadcast 433 from broadcast at BamSource.java:104
18:46:40.087 INFO MemoryStore - Block broadcast_434 stored as values in memory (estimated size 297.9 KiB, free 1916.3 MiB)
18:46:40.093 INFO MemoryStore - Block broadcast_434_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.2 MiB)
18:46:40.093 INFO BlockManagerInfo - Added broadcast_434_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:40.093 INFO SparkContext - Created broadcast 434 from newAPIHadoopFile at PathSplitSource.java:96
18:46:40.102 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.103 INFO FileInputFormat - Total input files to process : 1
18:46:40.103 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.117 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:40.117 INFO DAGScheduler - Got job 162 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:40.117 INFO DAGScheduler - Final stage: ResultStage 217 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:40.117 INFO DAGScheduler - Parents of final stage: List()
18:46:40.117 INFO DAGScheduler - Missing parents: List()
18:46:40.118 INFO DAGScheduler - Submitting ResultStage 217 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:40.124 INFO MemoryStore - Block broadcast_435 stored as values in memory (estimated size 148.2 KiB, free 1916.1 MiB)
18:46:40.128 INFO MemoryStore - Block broadcast_435_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.1 MiB)
18:46:40.128 INFO BlockManagerInfo - Removed broadcast_424_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1919.3 MiB)
18:46:40.128 INFO BlockManagerInfo - Added broadcast_435_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.3 MiB)
18:46:40.128 INFO SparkContext - Created broadcast 435 from broadcast at DAGScheduler.scala:1580
18:46:40.128 INFO BlockManagerInfo - Removed broadcast_432_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.4 MiB)
18:46:40.128 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 217 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:40.128 INFO TaskSchedulerImpl - Adding task set 217.0 with 1 tasks resource profile 0
18:46:40.129 INFO BlockManagerInfo - Removed broadcast_423_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:40.129 INFO BlockManagerInfo - Removed broadcast_425_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.6 MiB)
18:46:40.129 INFO TaskSetManager - Starting task 0.0 in stage 217.0 (TID 273) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:40.130 INFO Executor - Running task 0.0 in stage 217.0 (TID 273)
18:46:40.130 INFO BlockManagerInfo - Removed broadcast_430_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:40.130 INFO BlockManagerInfo - Removed broadcast_426_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1919.6 MiB)
18:46:40.130 INFO BlockManagerInfo - Removed broadcast_417_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:40.131 INFO BlockManagerInfo - Removed broadcast_422_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:40.131 INFO BlockManagerInfo - Removed broadcast_429_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.6 MiB)
18:46:40.132 INFO BlockManagerInfo - Removed broadcast_428_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:40.132 INFO BlockManagerInfo - Removed broadcast_431_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.8 MiB)
18:46:40.142 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam:0+237038
18:46:40.142 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.143 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.144 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.144 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.145 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.146 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:40.148 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:40.148 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:40.150 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:40.150 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:40.152 INFO Executor - Finished task 0.0 in stage 217.0 (TID 273). 651483 bytes result sent to driver
18:46:40.154 INFO TaskSetManager - Finished task 0.0 in stage 217.0 (TID 273) in 25 ms on localhost (executor driver) (1/1)
18:46:40.154 INFO TaskSchedulerImpl - Removed TaskSet 217.0, whose tasks have all completed, from pool
18:46:40.154 INFO DAGScheduler - ResultStage 217 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.036 s
18:46:40.154 INFO DAGScheduler - Job 162 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:40.154 INFO TaskSchedulerImpl - Killing all running tasks in stage 217: Stage finished
18:46:40.155 INFO DAGScheduler - Job 162 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.037493 s
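[editor's note] Job 162 reads the merged BAM back from HDFS; the RDD it collects was set up via newAPIHadoopFile (PathSplitSource.java:96) with a BAM-specific input format. The snippet below only illustrates the generic newAPIHadoopFile call shape with a plain text input format, as an assumed stand-in for the actual BAM source; path and app name are hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

/** Illustrative only: the call shape of newAPIHadoopFile, not the real BAM input format. */
public class ReadBackSketch {
    public static void main(String[] args) {
        try (JavaSparkContext jsc = new JavaSparkContext("local[*]", "read-back-sketch")) {
            JavaPairRDD<LongWritable, Text> records = jsc.newAPIHadoopFile(
                    "hdfs://localhost:8020/user/runner/example.bam", // hypothetical path
                    TextInputFormat.class, LongWritable.class, Text.class,
                    new Configuration());
            System.out.println("records: " + records.count());
        }
    }
}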
18:46:40.167 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:40.167 INFO DAGScheduler - Got job 163 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:40.167 INFO DAGScheduler - Final stage: ResultStage 218 (count at ReadsSparkSinkUnitTest.java:185)
18:46:40.167 INFO DAGScheduler - Parents of final stage: List()
18:46:40.167 INFO DAGScheduler - Missing parents: List()
18:46:40.168 INFO DAGScheduler - Submitting ResultStage 218 (MapPartitionsRDD[1021] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:40.184 INFO MemoryStore - Block broadcast_436 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
18:46:40.185 INFO MemoryStore - Block broadcast_436_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
18:46:40.186 INFO BlockManagerInfo - Added broadcast_436_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.7 MiB)
18:46:40.186 INFO SparkContext - Created broadcast 436 from broadcast at DAGScheduler.scala:1580
18:46:40.186 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 218 (MapPartitionsRDD[1021] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:40.186 INFO TaskSchedulerImpl - Adding task set 218.0 with 1 tasks resource profile 0
18:46:40.186 INFO TaskSetManager - Starting task 0.0 in stage 218.0 (TID 274) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:40.187 INFO Executor - Running task 0.0 in stage 218.0 (TID 274)
18:46:40.215 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:40.225 INFO Executor - Finished task 0.0 in stage 218.0 (TID 274). 989 bytes result sent to driver
18:46:40.226 INFO TaskSetManager - Finished task 0.0 in stage 218.0 (TID 274) in 40 ms on localhost (executor driver) (1/1)
18:46:40.226 INFO TaskSchedulerImpl - Removed TaskSet 218.0, whose tasks have all completed, from pool
18:46:40.226 INFO DAGScheduler - ResultStage 218 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
18:46:40.226 INFO DAGScheduler - Job 163 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:40.226 INFO TaskSchedulerImpl - Killing all running tasks in stage 218: Stage finished
18:46:40.226 INFO DAGScheduler - Job 163 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058659 s
18:46:40.231 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:40.231 INFO DAGScheduler - Got job 164 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:40.231 INFO DAGScheduler - Final stage: ResultStage 219 (count at ReadsSparkSinkUnitTest.java:185)
18:46:40.231 INFO DAGScheduler - Parents of final stage: List()
18:46:40.231 INFO DAGScheduler - Missing parents: List()
18:46:40.231 INFO DAGScheduler - Submitting ResultStage 219 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:40.237 INFO MemoryStore - Block broadcast_437 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
18:46:40.238 INFO MemoryStore - Block broadcast_437_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
18:46:40.238 INFO BlockManagerInfo - Added broadcast_437_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:40.238 INFO SparkContext - Created broadcast 437 from broadcast at DAGScheduler.scala:1580
18:46:40.238 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 219 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:40.238 INFO TaskSchedulerImpl - Adding task set 219.0 with 1 tasks resource profile 0
18:46:40.239 INFO TaskSetManager - Starting task 0.0 in stage 219.0 (TID 275) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:40.239 INFO Executor - Running task 0.0 in stage 219.0 (TID 275)
18:46:40.250 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam:0+237038
18:46:40.251 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.251 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam dst=null perm=null proto=rpc
18:46:40.252 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b8a1bf11-4972-4e18-b7cf-853584140b3d.bam.bai dst=null perm=null proto=rpc
18:46:40.254 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:40.256 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:40.256 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:40.258 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:40.260 INFO Executor - Finished task 0.0 in stage 219.0 (TID 275). 989 bytes result sent to driver
18:46:40.260 INFO TaskSetManager - Finished task 0.0 in stage 219.0 (TID 275) in 21 ms on localhost (executor driver) (1/1)
18:46:40.261 INFO TaskSchedulerImpl - Removed TaskSet 219.0, whose tasks have all completed, from pool
18:46:40.261 INFO DAGScheduler - ResultStage 219 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.030 s
18:46:40.261 INFO DAGScheduler - Job 164 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:40.261 INFO TaskSchedulerImpl - Killing all running tasks in stage 219: Stage finished
18:46:40.261 INFO DAGScheduler - Job 164 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.030036 s
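[editor's note] Jobs 163 and 164 count the reads in the original local BAM and in the round-tripped HDFS copy; the assertion at ReadsSparkSinkUnitTest.java:185 presumably compares the two counts. A hypothetical sketch of that check is below (the class name, helper name, and generic element type are placeholders, not the test's actual code).

import org.apache.spark.api.java.JavaRDD;

/** Hypothetical round-trip check: the read count must survive write + re-read. */
public final class RoundTripCheck {
    public static <T> void assertSameCount(JavaRDD<T> before, JavaRDD<T> after) {
        long expected = before.count();
        long actual = after.count();
        if (expected != actual) {
            throw new AssertionError("read count changed after round trip: "
                    + expected + " vs " + actual);
        }
    }
}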
18:46:40.272 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:40.273 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:40.273 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:40.274 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:40.276 INFO MemoryStore - Block broadcast_438 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:40.282 INFO MemoryStore - Block broadcast_438_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:40.282 INFO BlockManagerInfo - Added broadcast_438_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:40.283 INFO SparkContext - Created broadcast 438 from newAPIHadoopFile at PathSplitSource.java:96
18:46:40.304 INFO MemoryStore - Block broadcast_439 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
18:46:40.310 INFO MemoryStore - Block broadcast_439_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
18:46:40.310 INFO BlockManagerInfo - Added broadcast_439_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:40.310 INFO SparkContext - Created broadcast 439 from newAPIHadoopFile at PathSplitSource.java:96
18:46:40.330 INFO FileInputFormat - Total input files to process : 1
18:46:40.332 INFO MemoryStore - Block broadcast_440 stored as values in memory (estimated size 160.7 KiB, free 1917.5 MiB)
18:46:40.333 INFO MemoryStore - Block broadcast_440_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
18:46:40.333 INFO BlockManagerInfo - Added broadcast_440_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:40.333 INFO SparkContext - Created broadcast 440 from broadcast at ReadsSparkSink.java:133
18:46:40.334 INFO MemoryStore - Block broadcast_441 stored as values in memory (estimated size 163.2 KiB, free 1917.4 MiB)
18:46:40.335 INFO MemoryStore - Block broadcast_441_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
18:46:40.335 INFO BlockManagerInfo - Added broadcast_441_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:40.335 INFO SparkContext - Created broadcast 441 from broadcast at BamSink.java:76
18:46:40.337 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts dst=null perm=null proto=rpc
18:46:40.337 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:40.337 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:40.337 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:40.338 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:40.344 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:40.344 INFO DAGScheduler - Registering RDD 1053 (mapToPair at SparkUtils.java:161) as input to shuffle 44
18:46:40.345 INFO DAGScheduler - Got job 165 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:40.345 INFO DAGScheduler - Final stage: ResultStage 221 (runJob at SparkHadoopWriter.scala:83)
18:46:40.345 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 220)
18:46:40.345 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 220)
18:46:40.345 INFO DAGScheduler - Submitting ShuffleMapStage 220 (MapPartitionsRDD[1053] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:40.362 INFO MemoryStore - Block broadcast_442 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
18:46:40.363 INFO MemoryStore - Block broadcast_442_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
18:46:40.363 INFO BlockManagerInfo - Added broadcast_442_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.4 MiB)
18:46:40.363 INFO SparkContext - Created broadcast 442 from broadcast at DAGScheduler.scala:1580
18:46:40.364 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 220 (MapPartitionsRDD[1053] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:40.364 INFO TaskSchedulerImpl - Adding task set 220.0 with 1 tasks resource profile 0
18:46:40.364 INFO TaskSetManager - Starting task 0.0 in stage 220.0 (TID 276) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:40.364 INFO Executor - Running task 0.0 in stage 220.0 (TID 276)
18:46:40.394 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:40.409 INFO Executor - Finished task 0.0 in stage 220.0 (TID 276). 1148 bytes result sent to driver
18:46:40.410 INFO TaskSetManager - Finished task 0.0 in stage 220.0 (TID 276) in 46 ms on localhost (executor driver) (1/1)
18:46:40.410 INFO TaskSchedulerImpl - Removed TaskSet 220.0, whose tasks have all completed, from pool
18:46:40.410 INFO DAGScheduler - ShuffleMapStage 220 (mapToPair at SparkUtils.java:161) finished in 0.065 s
18:46:40.410 INFO DAGScheduler - looking for newly runnable stages
18:46:40.410 INFO DAGScheduler - running: HashSet()
18:46:40.410 INFO DAGScheduler - waiting: HashSet(ResultStage 221)
18:46:40.410 INFO DAGScheduler - failed: HashSet()
18:46:40.410 INFO DAGScheduler - Submitting ResultStage 221 (MapPartitionsRDD[1058] at mapToPair at BamSink.java:91), which has no missing parents
18:46:40.419 INFO MemoryStore - Block broadcast_443 stored as values in memory (estimated size 241.5 KiB, free 1916.4 MiB)
18:46:40.419 INFO MemoryStore - Block broadcast_443_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.4 MiB)
18:46:40.419 INFO BlockManagerInfo - Added broadcast_443_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.3 MiB)
18:46:40.420 INFO SparkContext - Created broadcast 443 from broadcast at DAGScheduler.scala:1580
18:46:40.420 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 221 (MapPartitionsRDD[1058] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:40.420 INFO TaskSchedulerImpl - Adding task set 221.0 with 1 tasks resource profile 0
18:46:40.420 INFO TaskSetManager - Starting task 0.0 in stage 221.0 (TID 277) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:40.421 INFO Executor - Running task 0.0 in stage 221.0 (TID 277)
18:46:40.425 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:40.425 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:40.436 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:40.436 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:40.436 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:40.436 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:40.436 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:40.436 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:40.437 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:40.438 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:40.439 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:40.442 INFO StateChange - BLOCK* allocate blk_1073741878_1054, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/part-r-00000
18:46:40.443 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741878_1054 src: /127.0.0.1:47334 dest: /127.0.0.1:38019
18:46:40.445 INFO clienttrace - src: /127.0.0.1:47334, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741878_1054, duration(ns): 1613011
18:46:40.446 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741878_1054, type=LAST_IN_PIPELINE terminating
18:46:40.446 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:40.447 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:40.448 INFO StateChange - BLOCK* allocate blk_1073741879_1055, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/.part-r-00000.sbi
18:46:40.448 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741879_1055 src: /127.0.0.1:47340 dest: /127.0.0.1:38019
18:46:40.449 INFO clienttrace - src: /127.0.0.1:47340, dest: /127.0.0.1:38019, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741879_1055, duration(ns): 447211
18:46:40.449 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741879_1055, type=LAST_IN_PIPELINE terminating
18:46:40.450 INFO FSNamesystem - BLOCK* blk_1073741879_1055 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/.part-r-00000.sbi
18:46:40.851 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:40.853 INFO StateChange - BLOCK* allocate blk_1073741880_1056, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/.part-r-00000.bai
18:46:40.853 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741880_1056 src: /127.0.0.1:47350 dest: /127.0.0.1:38019
18:46:40.855 INFO clienttrace - src: /127.0.0.1:47350, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741880_1056, duration(ns): 444607
18:46:40.855 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741880_1056, type=LAST_IN_PIPELINE terminating
18:46:40.855 INFO FSNamesystem - BLOCK* blk_1073741880_1056 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/.part-r-00000.bai
18:46:41.256 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:41.257 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0 dst=null perm=null proto=rpc
18:46:41.257 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0 dst=null perm=null proto=rpc
18:46:41.258 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/task_202505191846405430872073418813811_1058_r_000000 dst=null perm=null proto=rpc
18:46:41.258 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/_temporary/attempt_202505191846405430872073418813811_1058_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/task_202505191846405430872073418813811_1058_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:41.259 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846405430872073418813811_1058_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/task_202505191846405430872073418813811_1058_r_000000
18:46:41.259 INFO SparkHadoopMapRedUtil - attempt_202505191846405430872073418813811_1058_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:41.259 INFO Executor - Finished task 0.0 in stage 221.0 (TID 277). 1858 bytes result sent to driver
18:46:41.260 INFO TaskSetManager - Finished task 0.0 in stage 221.0 (TID 277) in 839 ms on localhost (executor driver) (1/1)
18:46:41.260 INFO TaskSchedulerImpl - Removed TaskSet 221.0, whose tasks have all completed, from pool
18:46:41.260 INFO DAGScheduler - ResultStage 221 (runJob at SparkHadoopWriter.scala:83) finished in 0.850 s
18:46:41.260 INFO DAGScheduler - Job 165 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:41.260 INFO TaskSchedulerImpl - Killing all running tasks in stage 221: Stage finished
18:46:41.260 INFO DAGScheduler - Job 165 finished: runJob at SparkHadoopWriter.scala:83, took 0.915786 s
18:46:41.260 INFO SparkHadoopWriter - Start to commit write Job job_202505191846405430872073418813811_1058.
18:46:41.261 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:41.261 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts dst=null perm=null proto=rpc
18:46:41.262 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/task_202505191846405430872073418813811_1058_r_000000 dst=null perm=null proto=rpc
18:46:41.262 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:41.263 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/task_202505191846405430872073418813811_1058_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:41.263 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:41.264 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/task_202505191846405430872073418813811_1058_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:41.264 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:41.265 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary/0/task_202505191846405430872073418813811_1058_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:41.265 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:41.266 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:41.267 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:41.267 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.spark-staging-1058 dst=null perm=null proto=rpc
18:46:41.268 INFO SparkHadoopWriter - Write Job job_202505191846405430872073418813811_1058 committed. Elapsed time: 7 ms.
18:46:41.268 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:41.270 INFO StateChange - BLOCK* allocate blk_1073741881_1057, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/header
18:46:41.270 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741881_1057 src: /127.0.0.1:47352 dest: /127.0.0.1:38019
18:46:41.272 INFO clienttrace - src: /127.0.0.1:47352, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741881_1057, duration(ns): 444229
18:46:41.272 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741881_1057, type=LAST_IN_PIPELINE terminating
18:46:41.272 INFO FSNamesystem - BLOCK* blk_1073741881_1057 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/header
18:46:41.673 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:41.674 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:41.675 INFO StateChange - BLOCK* allocate blk_1073741882_1058, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/terminator
18:46:41.676 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741882_1058 src: /127.0.0.1:47368 dest: /127.0.0.1:38019
18:46:41.677 INFO clienttrace - src: /127.0.0.1:47368, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741882_1058, duration(ns): 455042
18:46:41.677 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741882_1058, type=LAST_IN_PIPELINE terminating
18:46:41.678 INFO FSNamesystem - BLOCK* blk_1073741882_1058 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/terminator
18:46:42.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741872_1048 replica FinalizedReplica, blk_1073741872_1048, FINALIZED (getNumBytes()=212, getBytesOnDisk()=212, getVisibleLength()=212, getVolume()=/tmp/minicluster_storage13238592372457082651/data/data2, getBlockURI()=file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741872) for deletion
18:46:42.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741873_1049 replica FinalizedReplica, blk_1073741873_1049, FINALIZED (getNumBytes()=5472, getBytesOnDisk()=5472, getVisibleLength()=5472, getVolume()=/tmp/minicluster_storage13238592372457082651/data/data1, getBlockURI()=file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741873) for deletion
18:46:42.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741872_1048 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741872
18:46:42.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741873_1049 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741873
18:46:42.079 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:42.080 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts dst=null perm=null proto=rpc
18:46:42.081 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:42.082 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:42.082 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam
18:46:42.082 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:42.083 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.083 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:42.084 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam done
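[editor's note] The HadoopFileSystemWrapper and audit lines above show how the final BAM is assembled: an empty "output" file is created inside the .parts directory, the header, part-r-00000, and terminator files are concatenated into it at the HDFS metadata level (cmd=concat), any existing .bam is deleted, and "output" is renamed into place. A minimal Java sketch of that sequence, assuming it ultimately delegates to Hadoop's FileSystem.concat; the class and method names below are illustrative, not the actual HadoopFileSystemWrapper code:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConcatPartsSketch {
        // Mirrors the audit-log sequence: create empty target, concat parts, delete old output, rename.
        public static void concatParts(String partsDir, String finalBam) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(java.net.URI.create(partsDir), conf);

            Path target = new Path(partsDir, "output");
            fs.create(target).close();                       // cmd=create .../output (empty target)

            Path[] sources = {
                new Path(partsDir, "header"),
                new Path(partsDir, "part-r-00000"),
                new Path(partsDir, "terminator")
            };
            fs.concat(target, sources);                      // HDFS-level concat, no byte copying

            Path dest = new Path(finalBam);
            fs.delete(dest, false);                          // cmd=delete on any pre-existing .bam
            fs.rename(target, dest);                         // cmd=rename .../output -> .../*.bam
        }
    }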
18:46:42.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.084 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi
18:46:42.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts dst=null perm=null proto=rpc
18:46:42.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:42.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:42.087 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:42.088 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
18:46:42.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:42.090 INFO StateChange - BLOCK* allocate blk_1073741883_1059, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi
18:46:42.090 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741883_1059 src: /127.0.0.1:47384 dest: /127.0.0.1:38019
18:46:42.092 INFO clienttrace - src: /127.0.0.1:47384, dest: /127.0.0.1:38019, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741883_1059, duration(ns): 508589
18:46:42.092 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741883_1059, type=LAST_IN_PIPELINE terminating
18:46:42.092 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:42.093 INFO IndexFileMerger - Done merging .sbi files
18:46:42.093 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai
18:46:42.093 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts dst=null perm=null proto=rpc
18:46:42.094 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:42.094 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:42.095 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:42.096 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:42.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:42.097 INFO StateChange - BLOCK* allocate blk_1073741884_1060, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai
18:46:42.098 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741884_1060 src: /127.0.0.1:47386 dest: /127.0.0.1:38019
18:46:42.099 INFO clienttrace - src: /127.0.0.1:47386, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741884_1060, duration(ns): 364558
18:46:42.099 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741884_1060, type=LAST_IN_PIPELINE terminating
18:46:42.100 INFO FSNamesystem - BLOCK* blk_1073741884_1060 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai
18:46:42.501 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:42.501 INFO IndexFileMerger - Done merging .bai files
18:46:42.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.parts dst=null perm=null proto=rpc
18:46:42.510 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.518 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi dst=null perm=null proto=rpc
18:46:42.518 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi dst=null perm=null proto=rpc
18:46:42.519 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi dst=null perm=null proto=rpc
18:46:42.520 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
18:46:42.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.521 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.521 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.522 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.522 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.523 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.524 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:42.525 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:42.526 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:42.526 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:42.526 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi dst=null perm=null proto=rpc
18:46:42.526 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi dst=null perm=null proto=rpc
18:46:42.527 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.sbi dst=null perm=null proto=rpc
18:46:42.528 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
18:46:42.528 INFO MemoryStore - Block broadcast_444 stored as values in memory (estimated size 13.3 KiB, free 1916.4 MiB)
18:46:42.529 INFO MemoryStore - Block broadcast_444_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1916.3 MiB)
18:46:42.529 INFO BlockManagerInfo - Added broadcast_444_piece0 in memory on localhost:45727 (size: 8.3 KiB, free: 1919.3 MiB)
18:46:42.529 INFO SparkContext - Created broadcast 444 from broadcast at BamSource.java:104
18:46:42.530 INFO MemoryStore - Block broadcast_445 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
18:46:42.536 INFO MemoryStore - Block broadcast_445_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
18:46:42.536 INFO BlockManagerInfo - Added broadcast_445_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:42.536 INFO SparkContext - Created broadcast 445 from newAPIHadoopFile at PathSplitSource.java:96
18:46:42.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.545 INFO FileInputFormat - Total input files to process : 1
18:46:42.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.560 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:42.560 INFO DAGScheduler - Got job 166 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:42.560 INFO DAGScheduler - Final stage: ResultStage 222 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:42.560 INFO DAGScheduler - Parents of final stage: List()
18:46:42.560 INFO DAGScheduler - Missing parents: List()
18:46:42.560 INFO DAGScheduler - Submitting ResultStage 222 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:42.566 INFO MemoryStore - Block broadcast_446 stored as values in memory (estimated size 148.2 KiB, free 1915.9 MiB)
18:46:42.567 INFO MemoryStore - Block broadcast_446_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.8 MiB)
18:46:42.567 INFO BlockManagerInfo - Added broadcast_446_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.2 MiB)
18:46:42.567 INFO SparkContext - Created broadcast 446 from broadcast at DAGScheduler.scala:1580
18:46:42.567 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 222 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:42.567 INFO TaskSchedulerImpl - Adding task set 222.0 with 1 tasks resource profile 0
18:46:42.568 INFO TaskSetManager - Starting task 0.0 in stage 222.0 (TID 278) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:42.568 INFO Executor - Running task 0.0 in stage 222.0 (TID 278)
18:46:42.579 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam:0+237038
18:46:42.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.581 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.581 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.582 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.584 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:42.586 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:42.586 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:42.588 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:42.588 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:42.590 INFO Executor - Finished task 0.0 in stage 222.0 (TID 278). 651483 bytes result sent to driver
18:46:42.592 INFO TaskSetManager - Finished task 0.0 in stage 222.0 (TID 278) in 24 ms on localhost (executor driver) (1/1)
18:46:42.592 INFO TaskSchedulerImpl - Removed TaskSet 222.0, whose tasks have all completed, from pool
18:46:42.592 INFO DAGScheduler - ResultStage 222 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.031 s
18:46:42.592 INFO DAGScheduler - Job 166 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:42.592 INFO TaskSchedulerImpl - Killing all running tasks in stage 222: Stage finished
18:46:42.592 INFO DAGScheduler - Job 166 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.032286 s
18:46:42.607 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:42.607 INFO DAGScheduler - Got job 167 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:42.607 INFO DAGScheduler - Final stage: ResultStage 223 (count at ReadsSparkSinkUnitTest.java:185)
18:46:42.607 INFO DAGScheduler - Parents of final stage: List()
18:46:42.608 INFO DAGScheduler - Missing parents: List()
18:46:42.608 INFO DAGScheduler - Submitting ResultStage 223 (MapPartitionsRDD[1046] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:42.624 INFO MemoryStore - Block broadcast_447 stored as values in memory (estimated size 426.1 KiB, free 1915.4 MiB)
18:46:42.626 INFO MemoryStore - Block broadcast_447_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.2 MiB)
18:46:42.626 INFO BlockManagerInfo - Added broadcast_447_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.0 MiB)
18:46:42.626 INFO SparkContext - Created broadcast 447 from broadcast at DAGScheduler.scala:1580
18:46:42.626 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 223 (MapPartitionsRDD[1046] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:42.626 INFO TaskSchedulerImpl - Adding task set 223.0 with 1 tasks resource profile 0
18:46:42.627 INFO TaskSetManager - Starting task 0.0 in stage 223.0 (TID 279) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:42.627 INFO Executor - Running task 0.0 in stage 223.0 (TID 279)
18:46:42.656 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:42.666 INFO Executor - Finished task 0.0 in stage 223.0 (TID 279). 989 bytes result sent to driver
18:46:42.667 INFO TaskSetManager - Finished task 0.0 in stage 223.0 (TID 279) in 41 ms on localhost (executor driver) (1/1)
18:46:42.667 INFO TaskSchedulerImpl - Removed TaskSet 223.0, whose tasks have all completed, from pool
18:46:42.667 INFO DAGScheduler - ResultStage 223 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.059 s
18:46:42.667 INFO DAGScheduler - Job 167 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:42.667 INFO TaskSchedulerImpl - Killing all running tasks in stage 223: Stage finished
18:46:42.667 INFO DAGScheduler - Job 167 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059839 s
18:46:42.671 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:42.671 INFO DAGScheduler - Got job 168 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:42.671 INFO DAGScheduler - Final stage: ResultStage 224 (count at ReadsSparkSinkUnitTest.java:185)
18:46:42.671 INFO DAGScheduler - Parents of final stage: List()
18:46:42.671 INFO DAGScheduler - Missing parents: List()
18:46:42.671 INFO DAGScheduler - Submitting ResultStage 224 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:42.677 INFO MemoryStore - Block broadcast_448 stored as values in memory (estimated size 148.1 KiB, free 1915.1 MiB)
18:46:42.683 INFO MemoryStore - Block broadcast_448_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.1 MiB)
18:46:42.683 INFO BlockManagerInfo - Removed broadcast_439_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:42.683 INFO BlockManagerInfo - Added broadcast_448_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.0 MiB)
18:46:42.683 INFO SparkContext - Created broadcast 448 from broadcast at DAGScheduler.scala:1580
18:46:42.683 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 224 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:42.683 INFO TaskSchedulerImpl - Adding task set 224.0 with 1 tasks resource profile 0
18:46:42.683 INFO BlockManagerInfo - Removed broadcast_435_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.1 MiB)
18:46:42.684 INFO BlockManagerInfo - Removed broadcast_437_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.1 MiB)
18:46:42.684 INFO TaskSetManager - Starting task 0.0 in stage 224.0 (TID 280) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:42.684 INFO Executor - Running task 0.0 in stage 224.0 (TID 280)
18:46:42.684 INFO BlockManagerInfo - Removed broadcast_433_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.1 MiB)
18:46:42.685 INFO BlockManagerInfo - Removed broadcast_446_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.2 MiB)
18:46:42.685 INFO BlockManagerInfo - Removed broadcast_436_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:42.686 INFO BlockManagerInfo - Removed broadcast_442_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.5 MiB)
18:46:42.686 INFO BlockManagerInfo - Removed broadcast_443_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.6 MiB)
18:46:42.687 INFO BlockManagerInfo - Removed broadcast_427_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.6 MiB)
18:46:42.687 INFO BlockManagerInfo - Removed broadcast_434_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:42.687 INFO BlockManagerInfo - Removed broadcast_447_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.8 MiB)
18:46:42.688 INFO BlockManagerInfo - Removed broadcast_440_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:42.688 INFO BlockManagerInfo - Removed broadcast_441_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:42.697 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam:0+237038
18:46:42.697 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.698 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam dst=null perm=null proto=rpc
18:46:42.699 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.700 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.700 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7adec3e8-3843-4315-a227-ebc0b62b8c7c.bam.bai dst=null perm=null proto=rpc
18:46:42.701 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:42.704 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:42.705 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:42.705 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:42.706 INFO Executor - Finished task 0.0 in stage 224.0 (TID 280). 989 bytes result sent to driver
18:46:42.707 INFO TaskSetManager - Finished task 0.0 in stage 224.0 (TID 280) in 23 ms on localhost (executor driver) (1/1)
18:46:42.707 INFO TaskSchedulerImpl - Removed TaskSet 224.0, whose tasks have all completed, from pool
18:46:42.707 INFO DAGScheduler - ResultStage 224 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.036 s
18:46:42.707 INFO DAGScheduler - Job 168 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:42.707 INFO TaskSchedulerImpl - Killing all running tasks in stage 224: Stage finished
18:46:42.707 INFO DAGScheduler - Job 168 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.036038 s
18:46:42.715 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:42.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:42.716 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:42.717 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:42.719 INFO MemoryStore - Block broadcast_449 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
18:46:42.725 INFO MemoryStore - Block broadcast_449_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.8 MiB)
18:46:42.725 INFO BlockManagerInfo - Added broadcast_449_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.8 MiB)
18:46:42.725 INFO SparkContext - Created broadcast 449 from newAPIHadoopFile at PathSplitSource.java:96
18:46:42.747 INFO MemoryStore - Block broadcast_450 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
18:46:42.753 INFO MemoryStore - Block broadcast_450_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
18:46:42.753 INFO BlockManagerInfo - Added broadcast_450_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:42.753 INFO SparkContext - Created broadcast 450 from newAPIHadoopFile at PathSplitSource.java:96
18:46:42.773 INFO FileInputFormat - Total input files to process : 1
18:46:42.774 INFO MemoryStore - Block broadcast_451 stored as values in memory (estimated size 160.7 KiB, free 1918.3 MiB)
18:46:42.775 INFO MemoryStore - Block broadcast_451_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.3 MiB)
18:46:42.775 INFO BlockManagerInfo - Added broadcast_451_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.7 MiB)
18:46:42.775 INFO SparkContext - Created broadcast 451 from broadcast at ReadsSparkSink.java:133
18:46:42.775 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:42.776 INFO MemoryStore - Block broadcast_452 stored as values in memory (estimated size 163.2 KiB, free 1918.1 MiB)
18:46:42.777 INFO MemoryStore - Block broadcast_452_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.1 MiB)
18:46:42.777 INFO BlockManagerInfo - Added broadcast_452_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.7 MiB)
18:46:42.777 INFO SparkContext - Created broadcast 452 from broadcast at BamSink.java:76
18:46:42.779 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts dst=null perm=null proto=rpc
18:46:42.779 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:42.779 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:42.779 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
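[editor's note] The three committer lines above report the defaults in effect for this write job: no PathOutputCommitterFactory configured (so FileOutputCommitterFactory is used), commit algorithm version 1, _temporary cleanup not skipped, and cleanup failures not ignored. A small sketch of the Hadoop configuration keys that govern those messages, assuming the standard key names; the values simply mirror what the log reports:

    import org.apache.hadoop.conf.Configuration;

    public class CommitterConfigSketch {
        public static Configuration committerDefaults() {
            Configuration conf = new Configuration();
            // v1 commits each task into _temporary/<attempt>/task_* and renames on job commit.
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            // Matches "skip cleanup _temporary folders under output directory:false".
            conf.setBoolean("mapreduce.fileoutputcommitter.cleanup.skipped", false);
            // Matches "ignore cleanup failures: false".
            conf.setBoolean("mapreduce.fileoutputcommitter.cleanup-failures.ignored", false);
            // Leaving the committer factory unset is what produces the
            // "No output committer factory defined" line (as I understand it).
            return conf;
        }
    }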
18:46:42.780 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:42.786 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:42.786 INFO DAGScheduler - Registering RDD 1078 (mapToPair at SparkUtils.java:161) as input to shuffle 45
18:46:42.787 INFO DAGScheduler - Got job 169 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:42.787 INFO DAGScheduler - Final stage: ResultStage 226 (runJob at SparkHadoopWriter.scala:83)
18:46:42.787 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 225)
18:46:42.787 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 225)
18:46:42.787 INFO DAGScheduler - Submitting ShuffleMapStage 225 (MapPartitionsRDD[1078] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:42.804 INFO MemoryStore - Block broadcast_453 stored as values in memory (estimated size 520.4 KiB, free 1917.6 MiB)
18:46:42.806 INFO MemoryStore - Block broadcast_453_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.4 MiB)
18:46:42.806 INFO BlockManagerInfo - Added broadcast_453_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.6 MiB)
18:46:42.806 INFO SparkContext - Created broadcast 453 from broadcast at DAGScheduler.scala:1580
18:46:42.806 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 225 (MapPartitionsRDD[1078] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:42.806 INFO TaskSchedulerImpl - Adding task set 225.0 with 1 tasks resource profile 0
18:46:42.806 INFO TaskSetManager - Starting task 0.0 in stage 225.0 (TID 281) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:42.807 INFO Executor - Running task 0.0 in stage 225.0 (TID 281)
18:46:42.836 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:42.851 INFO Executor - Finished task 0.0 in stage 225.0 (TID 281). 1148 bytes result sent to driver
18:46:42.851 INFO TaskSetManager - Finished task 0.0 in stage 225.0 (TID 281) in 45 ms on localhost (executor driver) (1/1)
18:46:42.851 INFO TaskSchedulerImpl - Removed TaskSet 225.0, whose tasks have all completed, from pool
18:46:42.851 INFO DAGScheduler - ShuffleMapStage 225 (mapToPair at SparkUtils.java:161) finished in 0.064 s
18:46:42.851 INFO DAGScheduler - looking for newly runnable stages
18:46:42.851 INFO DAGScheduler - running: HashSet()
18:46:42.851 INFO DAGScheduler - waiting: HashSet(ResultStage 226)
18:46:42.851 INFO DAGScheduler - failed: HashSet()
18:46:42.852 INFO DAGScheduler - Submitting ResultStage 226 (MapPartitionsRDD[1083] at mapToPair at BamSink.java:91), which has no missing parents
18:46:42.859 INFO MemoryStore - Block broadcast_454 stored as values in memory (estimated size 241.5 KiB, free 1917.2 MiB)
18:46:42.860 INFO MemoryStore - Block broadcast_454_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1917.1 MiB)
18:46:42.860 INFO BlockManagerInfo - Added broadcast_454_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.5 MiB)
18:46:42.860 INFO SparkContext - Created broadcast 454 from broadcast at DAGScheduler.scala:1580
18:46:42.860 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 226 (MapPartitionsRDD[1083] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:42.860 INFO TaskSchedulerImpl - Adding task set 226.0 with 1 tasks resource profile 0
18:46:42.861 INFO TaskSetManager - Starting task 0.0 in stage 226.0 (TID 282) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:42.861 INFO Executor - Running task 0.0 in stage 226.0 (TID 282)
18:46:42.867 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:42.867 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:42.881 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:42.881 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:42.881 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:42.881 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:42.881 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:42.881 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:42.882 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:42.883 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:42.885 INFO StateChange - BLOCK* allocate blk_1073741885_1061, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0/part-r-00000
18:46:42.886 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741885_1061 src: /127.0.0.1:47418 dest: /127.0.0.1:38019
18:46:42.888 INFO clienttrace - src: /127.0.0.1:47418, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741885_1061, duration(ns): 1209660
18:46:42.888 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741885_1061, type=LAST_IN_PIPELINE terminating
18:46:42.889 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:42.889 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:42.891 INFO StateChange - BLOCK* allocate blk_1073741886_1062, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0/.part-r-00000.bai
18:46:42.892 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741886_1062 src: /127.0.0.1:47426 dest: /127.0.0.1:38019
18:46:42.893 INFO clienttrace - src: /127.0.0.1:47426, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741886_1062, duration(ns): 387884
18:46:42.893 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741886_1062, type=LAST_IN_PIPELINE terminating
18:46:42.893 INFO FSNamesystem - BLOCK* blk_1073741886_1062 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0/.part-r-00000.bai
18:46:43.294 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:43.295 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0 dst=null perm=null proto=rpc
18:46:43.296 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0 dst=null perm=null proto=rpc
18:46:43.296 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/task_202505191846427637641182851198301_1083_r_000000 dst=null perm=null proto=rpc
18:46:43.297 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/_temporary/attempt_202505191846427637641182851198301_1083_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/task_202505191846427637641182851198301_1083_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:43.297 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846427637641182851198301_1083_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/task_202505191846427637641182851198301_1083_r_000000
18:46:43.297 INFO SparkHadoopMapRedUtil - attempt_202505191846427637641182851198301_1083_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:43.297 INFO Executor - Finished task 0.0 in stage 226.0 (TID 282). 1858 bytes result sent to driver
18:46:43.298 INFO TaskSetManager - Finished task 0.0 in stage 226.0 (TID 282) in 437 ms on localhost (executor driver) (1/1)
18:46:43.298 INFO TaskSchedulerImpl - Removed TaskSet 226.0, whose tasks have all completed, from pool
18:46:43.298 INFO DAGScheduler - ResultStage 226 (runJob at SparkHadoopWriter.scala:83) finished in 0.446 s
18:46:43.298 INFO DAGScheduler - Job 169 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:43.298 INFO TaskSchedulerImpl - Killing all running tasks in stage 226: Stage finished
18:46:43.298 INFO DAGScheduler - Job 169 finished: runJob at SparkHadoopWriter.scala:83, took 0.511958 s
18:46:43.298 INFO SparkHadoopWriter - Start to commit write Job job_202505191846427637641182851198301_1083.
18:46:43.299 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:43.299 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts dst=null perm=null proto=rpc
18:46:43.300 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/task_202505191846427637641182851198301_1083_r_000000 dst=null perm=null proto=rpc
18:46:43.300 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:43.301 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/task_202505191846427637641182851198301_1083_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:43.301 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:43.301 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary/0/task_202505191846427637641182851198301_1083_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:43.302 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:43.303 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:43.303 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:43.304 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/.spark-staging-1083 dst=null perm=null proto=rpc
18:46:43.304 INFO SparkHadoopWriter - Write Job job_202505191846427637641182851198301_1083 committed. Elapsed time: 5 ms.
18:46:43.305 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:43.306 INFO StateChange - BLOCK* allocate blk_1073741887_1063, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/header
18:46:43.307 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741887_1063 src: /127.0.0.1:47442 dest: /127.0.0.1:38019
18:46:43.308 INFO clienttrace - src: /127.0.0.1:47442, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741887_1063, duration(ns): 458315
18:46:43.308 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741887_1063, type=LAST_IN_PIPELINE terminating
18:46:43.309 INFO FSNamesystem - BLOCK* blk_1073741887_1063 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/header
18:46:43.710 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:43.711 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:43.712 INFO StateChange - BLOCK* allocate blk_1073741888_1064, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/terminator
18:46:43.713 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741888_1064 src: /127.0.0.1:47456 dest: /127.0.0.1:38019
18:46:43.714 INFO clienttrace - src: /127.0.0.1:47456, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741888_1064, duration(ns): 419132
18:46:43.714 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741888_1064, type=LAST_IN_PIPELINE terminating
18:46:43.715 INFO FSNamesystem - BLOCK* blk_1073741888_1064 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/terminator
18:46:44.116 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:44.117 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts dst=null perm=null proto=rpc
18:46:44.118 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:44.118 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:44.119 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam
18:46:44.119 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:44.119 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.120 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.120 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:44.121 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam done
18:46:44.121 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.121 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai
18:46:44.121 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts dst=null perm=null proto=rpc
18:46:44.122 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:44.123 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:44.123 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:44.124 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:44.125 INFO StateChange - BLOCK* allocate blk_1073741889_1065, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai
18:46:44.126 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741889_1065 src: /127.0.0.1:47472 dest: /127.0.0.1:38019
18:46:44.127 INFO clienttrace - src: /127.0.0.1:47472, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741889_1065, duration(ns): 534776
18:46:44.127 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741889_1065, type=LAST_IN_PIPELINE terminating
18:46:44.128 INFO FSNamesystem - BLOCK* blk_1073741889_1065 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai
18:46:44.529 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:44.529 INFO IndexFileMerger - Done merging .bai files
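[editor's note] In this run the .bai "merge" above has only a single per-part index (.part-r-00000.bai), which is read, written out as the final .bam.bai, and then the temp index and the whole .parts directory are deleted. The real IndexFileMerger is format-aware and does more than byte copying; purely to illustrate the single-part case visible in this log, a hypothetical copy of one index file between HDFS paths might look like this (names are illustrative):

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;
    import java.io.InputStream;
    import java.io.OutputStream;

    public class SinglePartIndexCopySketch {
        // Copies the lone per-part index to the final index path, then removes the temp file,
        // roughly matching the open/create/delete audit entries above.
        public static void copyIndex(FileSystem fs, Path partIndex, Path finalIndex) throws Exception {
            try (InputStream in = fs.open(partIndex);
                 OutputStream out = fs.create(finalIndex, true)) {
                IOUtils.copyBytes(in, out, 64 * 1024);
            }
            fs.delete(partIndex, false);
        }
    }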
18:46:44.530 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.parts dst=null perm=null proto=rpc
18:46:44.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.541 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.541 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.544 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.545 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.545 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.sbi dst=null perm=null proto=rpc
18:46:44.546 INFO MemoryStore - Block broadcast_455 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
18:46:44.552 INFO MemoryStore - Block broadcast_455_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
18:46:44.552 INFO BlockManagerInfo - Added broadcast_455_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:44.552 INFO SparkContext - Created broadcast 455 from newAPIHadoopFile at PathSplitSource.java:96
18:46:44.572 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.572 INFO FileInputFormat - Total input files to process : 1
18:46:44.573 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.608 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:44.609 INFO DAGScheduler - Got job 170 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:44.609 INFO DAGScheduler - Final stage: ResultStage 227 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:44.609 INFO DAGScheduler - Parents of final stage: List()
18:46:44.609 INFO DAGScheduler - Missing parents: List()
18:46:44.609 INFO DAGScheduler - Submitting ResultStage 227 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:44.626 INFO MemoryStore - Block broadcast_456 stored as values in memory (estimated size 426.2 KiB, free 1916.4 MiB)
18:46:44.627 INFO MemoryStore - Block broadcast_456_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.2 MiB)
18:46:44.627 INFO BlockManagerInfo - Added broadcast_456_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.3 MiB)
18:46:44.627 INFO SparkContext - Created broadcast 456 from broadcast at DAGScheduler.scala:1580
18:46:44.627 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 227 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:44.628 INFO TaskSchedulerImpl - Adding task set 227.0 with 1 tasks resource profile 0
18:46:44.628 INFO TaskSetManager - Starting task 0.0 in stage 227.0 (TID 283) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:44.628 INFO Executor - Running task 0.0 in stage 227.0 (TID 283)
18:46:44.657 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam:0+237038
18:46:44.658 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.658 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.659 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:44.660 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.660 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.661 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.661 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.662 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.663 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:44.665 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.665 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.665 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.666 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.666 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.667 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:44.671 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.671 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.672 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.673 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.673 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.674 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.675 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.676 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.676 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.677 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.678 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.678 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.679 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.679 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.680 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.681 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.682 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.683 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.683 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.684 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.685 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.686 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.686 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.687 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.688 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.688 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.689 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.690 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.691 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.691 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.692 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.693 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.693 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.695 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.696 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.697 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.698 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.698 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.699 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.699 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.700 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.701 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.702 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.703 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.703 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.704 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.704 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.705 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.706 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.707 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.707 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.708 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.709 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.710 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.711 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.711 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.713 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.713 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.714 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.715 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.716 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.716 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.717 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.717 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.718 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.718 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.720 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:44.721 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.722 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.723 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.723 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:44.725 INFO Executor - Finished task 0.0 in stage 227.0 (TID 283). 651483 bytes result sent to driver
18:46:44.727 INFO TaskSetManager - Finished task 0.0 in stage 227.0 (TID 283) in 99 ms on localhost (executor driver) (1/1)
18:46:44.727 INFO TaskSchedulerImpl - Removed TaskSet 227.0, whose tasks have all completed, from pool
18:46:44.727 INFO DAGScheduler - ResultStage 227 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.118 s
18:46:44.727 INFO DAGScheduler - Job 170 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:44.727 INFO TaskSchedulerImpl - Killing all running tasks in stage 227: Stage finished
18:46:44.728 INFO DAGScheduler - Job 170 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.119064 s
18:46:44.737 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:44.737 INFO DAGScheduler - Got job 171 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:44.737 INFO DAGScheduler - Final stage: ResultStage 228 (count at ReadsSparkSinkUnitTest.java:185)
18:46:44.737 INFO DAGScheduler - Parents of final stage: List()
18:46:44.737 INFO DAGScheduler - Missing parents: List()
18:46:44.737 INFO DAGScheduler - Submitting ResultStage 228 (MapPartitionsRDD[1071] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:44.754 INFO MemoryStore - Block broadcast_457 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
18:46:44.755 INFO MemoryStore - Block broadcast_457_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.6 MiB)
18:46:44.755 INFO BlockManagerInfo - Added broadcast_457_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.1 MiB)
18:46:44.756 INFO SparkContext - Created broadcast 457 from broadcast at DAGScheduler.scala:1580
18:46:44.756 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 228 (MapPartitionsRDD[1071] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:44.756 INFO TaskSchedulerImpl - Adding task set 228.0 with 1 tasks resource profile 0
18:46:44.756 INFO TaskSetManager - Starting task 0.0 in stage 228.0 (TID 284) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:44.756 INFO Executor - Running task 0.0 in stage 228.0 (TID 284)
18:46:44.791 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:44.801 INFO Executor - Finished task 0.0 in stage 228.0 (TID 284). 989 bytes result sent to driver
18:46:44.802 INFO TaskSetManager - Finished task 0.0 in stage 228.0 (TID 284) in 46 ms on localhost (executor driver) (1/1)
18:46:44.802 INFO TaskSchedulerImpl - Removed TaskSet 228.0, whose tasks have all completed, from pool
18:46:44.802 INFO DAGScheduler - ResultStage 228 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
18:46:44.802 INFO DAGScheduler - Job 171 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:44.802 INFO TaskSchedulerImpl - Killing all running tasks in stage 228: Stage finished
18:46:44.802 INFO DAGScheduler - Job 171 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.065094 s
18:46:44.805 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:44.806 INFO DAGScheduler - Got job 172 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:44.806 INFO DAGScheduler - Final stage: ResultStage 229 (count at ReadsSparkSinkUnitTest.java:185)
18:46:44.806 INFO DAGScheduler - Parents of final stage: List()
18:46:44.806 INFO DAGScheduler - Missing parents: List()
18:46:44.806 INFO DAGScheduler - Submitting ResultStage 229 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:44.828 INFO MemoryStore - Block broadcast_458 stored as values in memory (estimated size 426.1 KiB, free 1915.2 MiB)
18:46:44.830 INFO MemoryStore - Block broadcast_458_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.1 MiB)
18:46:44.830 INFO BlockManagerInfo - Added broadcast_458_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.0 MiB)
18:46:44.830 INFO SparkContext - Created broadcast 458 from broadcast at DAGScheduler.scala:1580
18:46:44.830 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 229 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:44.830 INFO TaskSchedulerImpl - Adding task set 229.0 with 1 tasks resource profile 0
18:46:44.831 INFO TaskSetManager - Starting task 0.0 in stage 229.0 (TID 285) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:44.831 INFO Executor - Running task 0.0 in stage 229.0 (TID 285)
18:46:44.862 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam:0+237038
18:46:44.863 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.863 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.864 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:44.865 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.865 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.866 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.866 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.867 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.868 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:44.870 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.870 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.870 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.871 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.871 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.872 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:44.876 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.876 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.877 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.878 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.878 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.880 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.880 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.881 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.882 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.883 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.883 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.884 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.884 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.886 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.886 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.887 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.888 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.888 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.889 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.890 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.891 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.891 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.892 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.893 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.894 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.894 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.902 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.903 INFO BlockManagerInfo - Removed broadcast_454_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.1 MiB)
18:46:44.903 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.903 INFO BlockManagerInfo - Removed broadcast_445_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:44.904 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.904 INFO BlockManagerInfo - Removed broadcast_456_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:44.904 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.905 INFO BlockManagerInfo - Removed broadcast_457_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.4 MiB)
18:46:44.905 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.905 INFO BlockManagerInfo - Removed broadcast_450_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:44.906 INFO BlockManagerInfo - Removed broadcast_451_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:44.906 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.907 INFO BlockManagerInfo - Removed broadcast_444_piece0 on localhost:45727 in memory (size: 8.3 KiB, free: 1919.5 MiB)
18:46:44.907 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.907 INFO BlockManagerInfo - Removed broadcast_448_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.5 MiB)
18:46:44.907 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.908 INFO BlockManagerInfo - Removed broadcast_453_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.7 MiB)
18:46:44.908 INFO BlockManagerInfo - Removed broadcast_452_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.7 MiB)
18:46:44.908 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.909 INFO BlockManagerInfo - Removed broadcast_438_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:44.909 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.910 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.910 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.911 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.911 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.912 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.913 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.914 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.914 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.915 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.916 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.917 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.917 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.918 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.919 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.919 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.920 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.921 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.922 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.922 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.923 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.924 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.924 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.925 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.926 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.927 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.928 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
18:46:44.928 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.928 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.929 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.930 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam dst=null perm=null proto=rpc
18:46:44.930 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.931 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.931 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_32605581-0f97-4f66-858b-5fd14fb681d9.bam.bai dst=null perm=null proto=rpc
18:46:44.933 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:44.934 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.935 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:44.936 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:44.937 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:44.939 INFO Executor - Finished task 0.0 in stage 229.0 (TID 285). 1032 bytes result sent to driver
18:46:44.939 INFO TaskSetManager - Finished task 0.0 in stage 229.0 (TID 285) in 108 ms on localhost (executor driver) (1/1)
18:46:44.939 INFO TaskSchedulerImpl - Removed TaskSet 229.0, whose tasks have all completed, from pool
18:46:44.939 INFO DAGScheduler - ResultStage 229 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.133 s
18:46:44.939 INFO DAGScheduler - Job 172 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:44.939 INFO TaskSchedulerImpl - Killing all running tasks in stage 229: Stage finished
18:46:44.939 INFO DAGScheduler - Job 172 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.134013 s
18:46:44.951 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:44.951 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:44.952 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:44.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:44.955 INFO MemoryStore - Block broadcast_459 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
18:46:44.961 INFO MemoryStore - Block broadcast_459_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
18:46:44.961 INFO BlockManagerInfo - Added broadcast_459_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:44.962 INFO SparkContext - Created broadcast 459 from newAPIHadoopFile at PathSplitSource.java:96
18:46:44.983 INFO MemoryStore - Block broadcast_460 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:44.990 INFO MemoryStore - Block broadcast_460_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
18:46:44.990 INFO BlockManagerInfo - Added broadcast_460_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:44.990 INFO SparkContext - Created broadcast 460 from newAPIHadoopFile at PathSplitSource.java:96
18:46:45.011 INFO FileInputFormat - Total input files to process : 1
18:46:45.013 INFO MemoryStore - Block broadcast_461 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
18:46:45.014 INFO MemoryStore - Block broadcast_461_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
18:46:45.014 INFO BlockManagerInfo - Added broadcast_461_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:45.014 INFO SparkContext - Created broadcast 461 from broadcast at ReadsSparkSink.java:133
18:46:45.014 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:45.016 INFO MemoryStore - Block broadcast_462 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
18:46:45.016 INFO MemoryStore - Block broadcast_462_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
18:46:45.016 INFO BlockManagerInfo - Added broadcast_462_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:45.017 INFO SparkContext - Created broadcast 462 from broadcast at BamSink.java:76
18:46:45.019 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts dst=null perm=null proto=rpc
18:46:45.019 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:45.019 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:45.019 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:45.020 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:45.026 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:45.026 INFO DAGScheduler - Registering RDD 1104 (mapToPair at SparkUtils.java:161) as input to shuffle 46
18:46:45.027 INFO DAGScheduler - Got job 173 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:45.027 INFO DAGScheduler - Final stage: ResultStage 231 (runJob at SparkHadoopWriter.scala:83)
18:46:45.027 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 230)
18:46:45.027 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 230)
18:46:45.027 INFO DAGScheduler - Submitting ShuffleMapStage 230 (MapPartitionsRDD[1104] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:45.045 INFO MemoryStore - Block broadcast_463 stored as values in memory (estimated size 520.4 KiB, free 1917.2 MiB)
18:46:45.046 INFO MemoryStore - Block broadcast_463_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.1 MiB)
18:46:45.046 INFO BlockManagerInfo - Added broadcast_463_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.5 MiB)
18:46:45.046 INFO SparkContext - Created broadcast 463 from broadcast at DAGScheduler.scala:1580
18:46:45.046 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 230 (MapPartitionsRDD[1104] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:45.046 INFO TaskSchedulerImpl - Adding task set 230.0 with 1 tasks resource profile 0
18:46:45.047 INFO TaskSetManager - Starting task 0.0 in stage 230.0 (TID 286) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:45.047 INFO Executor - Running task 0.0 in stage 230.0 (TID 286)
18:46:45.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741879_1055 replica FinalizedReplica, blk_1073741879_1055, FINALIZED
  getNumBytes()     = 13492
  getBytesOnDisk()  = 13492
  getVisibleLength()= 13492
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741879 for deletion
18:46:45.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741880_1056 replica FinalizedReplica, blk_1073741880_1056, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741880 for deletion
18:46:45.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741886_1062 replica FinalizedReplica, blk_1073741886_1062, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741886 for deletion
18:46:45.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741879_1055 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741879
18:46:45.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741880_1056 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741880
18:46:45.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741886_1062 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741886
18:46:45.077 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:45.092 INFO Executor - Finished task 0.0 in stage 230.0 (TID 286). 1148 bytes result sent to driver
18:46:45.093 INFO TaskSetManager - Finished task 0.0 in stage 230.0 (TID 286) in 46 ms on localhost (executor driver) (1/1)
18:46:45.093 INFO TaskSchedulerImpl - Removed TaskSet 230.0, whose tasks have all completed, from pool
18:46:45.093 INFO DAGScheduler - ShuffleMapStage 230 (mapToPair at SparkUtils.java:161) finished in 0.066 s
18:46:45.093 INFO DAGScheduler - looking for newly runnable stages
18:46:45.093 INFO DAGScheduler - running: HashSet()
18:46:45.093 INFO DAGScheduler - waiting: HashSet(ResultStage 231)
18:46:45.093 INFO DAGScheduler - failed: HashSet()
18:46:45.093 INFO DAGScheduler - Submitting ResultStage 231 (MapPartitionsRDD[1109] at mapToPair at BamSink.java:91), which has no missing parents
18:46:45.100 INFO MemoryStore - Block broadcast_464 stored as values in memory (estimated size 241.5 KiB, free 1916.8 MiB)
18:46:45.101 INFO MemoryStore - Block broadcast_464_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.8 MiB)
18:46:45.101 INFO BlockManagerInfo - Added broadcast_464_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.4 MiB)
18:46:45.101 INFO SparkContext - Created broadcast 464 from broadcast at DAGScheduler.scala:1580
18:46:45.101 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 231 (MapPartitionsRDD[1109] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:45.101 INFO TaskSchedulerImpl - Adding task set 231.0 with 1 tasks resource profile 0
18:46:45.102 INFO TaskSetManager - Starting task 0.0 in stage 231.0 (TID 287) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:45.102 INFO Executor - Running task 0.0 in stage 231.0 (TID 287)
18:46:45.106 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:45.106 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:45.118 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:45.118 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:45.118 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:45.118 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:45.118 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:45.118 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:45.120 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.121 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.124 INFO StateChange - BLOCK* allocate blk_1073741890_1066, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0/part-r-00000
18:46:45.125 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741890_1066 src: /127.0.0.1:48166 dest: /127.0.0.1:38019
18:46:45.128 INFO clienttrace - src: /127.0.0.1:48166, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741890_1066, duration(ns): 1571947
18:46:45.128 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741890_1066, type=LAST_IN_PIPELINE terminating
18:46:45.129 INFO FSNamesystem - BLOCK* blk_1073741890_1066 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0/part-r-00000
18:46:45.530 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:45.530 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:45.531 INFO StateChange - BLOCK* allocate blk_1073741891_1067, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0/.part-r-00000.sbi
18:46:45.532 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741891_1067 src: /127.0.0.1:48182 dest: /127.0.0.1:38019
18:46:45.533 INFO clienttrace - src: /127.0.0.1:48182, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741891_1067, duration(ns): 390679
18:46:45.533 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741891_1067, type=LAST_IN_PIPELINE terminating
18:46:45.534 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:45.535 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0 dst=null perm=null proto=rpc
18:46:45.535 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0 dst=null perm=null proto=rpc
18:46:45.536 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/task_202505191846456966200911099982838_1109_r_000000 dst=null perm=null proto=rpc
18:46:45.536 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/_temporary/attempt_202505191846456966200911099982838_1109_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/task_202505191846456966200911099982838_1109_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:45.536 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846456966200911099982838_1109_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/task_202505191846456966200911099982838_1109_r_000000
18:46:45.536 INFO SparkHadoopMapRedUtil - attempt_202505191846456966200911099982838_1109_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:45.537 INFO Executor - Finished task 0.0 in stage 231.0 (TID 287). 1858 bytes result sent to driver
18:46:45.537 INFO TaskSetManager - Finished task 0.0 in stage 231.0 (TID 287) in 435 ms on localhost (executor driver) (1/1)
18:46:45.537 INFO TaskSchedulerImpl - Removed TaskSet 231.0, whose tasks have all completed, from pool
18:46:45.537 INFO DAGScheduler - ResultStage 231 (runJob at SparkHadoopWriter.scala:83) finished in 0.444 s
18:46:45.537 INFO DAGScheduler - Job 173 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:45.537 INFO TaskSchedulerImpl - Killing all running tasks in stage 231: Stage finished
18:46:45.538 INFO DAGScheduler - Job 173 finished: runJob at SparkHadoopWriter.scala:83, took 0.511511 s
18:46:45.538 INFO SparkHadoopWriter - Start to commit write Job job_202505191846456966200911099982838_1109.
18:46:45.538 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:45.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts dst=null perm=null proto=rpc
18:46:45.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/task_202505191846456966200911099982838_1109_r_000000 dst=null perm=null proto=rpc
18:46:45.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:45.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/task_202505191846456966200911099982838_1109_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:45.541 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary/0/task_202505191846456966200911099982838_1109_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.541 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:45.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.543 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:45.543 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/.spark-staging-1109 dst=null perm=null proto=rpc
18:46:45.543 INFO SparkHadoopWriter - Write Job job_202505191846456966200911099982838_1109 committed. Elapsed time: 5 ms.
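[Editor's note] The audit trail above (create under _temporary, rename of the task attempt directory, promotion of part-r-00000, deletion of _temporary, and the _SUCCESS marker) is the standard FileOutputCommitter algorithm-version-1 commit sequence. The following is a minimal, hypothetical sketch of that sequence against a Hadoop FileSystem; the class name and paths are illustrative assumptions, not values taken from this test.

    // Hedged sketch (not GATK/Spark code): the FileOutputCommitter v1 steps
    // that the audit lines above record. Paths are invented for illustration.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CommitV1Sketch {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());

            Path out      = new Path("/user/runner/example.bam.parts");          // job output dir
            Path attempt  = new Path(out, "_temporary/0/_temporary/attempt_0");  // task writes here
            Path taskDone = new Path(out, "_temporary/0/task_0");                // after commitTask

            fs.mkdirs(attempt);
            fs.create(new Path(attempt, "part-r-00000")).close();  // task output file
            fs.rename(attempt, taskDone);                          // commitTask: rename attempt dir
            fs.rename(new Path(taskDone, "part-r-00000"),
                      new Path(out, "part-r-00000"));              // commitJob: promote outputs
            fs.delete(new Path(out, "_temporary"), true);          // drop the temp tree
            fs.create(new Path(out, "_SUCCESS")).close();          // success marker
        }
    }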
18:46:45.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.545 INFO StateChange - BLOCK* allocate blk_1073741892_1068, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/header
18:46:45.546 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741892_1068 src: /127.0.0.1:48192 dest: /127.0.0.1:38019
18:46:45.547 INFO clienttrace - src: /127.0.0.1:48192, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741892_1068, duration(ns): 405867
18:46:45.547 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741892_1068, type=LAST_IN_PIPELINE terminating
18:46:45.547 INFO FSNamesystem - BLOCK* blk_1073741892_1068 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/header
18:46:45.948 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:45.949 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.950 INFO StateChange - BLOCK* allocate blk_1073741893_1069, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/terminator
18:46:45.951 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741893_1069 src: /127.0.0.1:48194 dest: /127.0.0.1:38019
18:46:45.952 INFO clienttrace - src: /127.0.0.1:48194, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741893_1069, duration(ns): 373728
18:46:45.952 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741893_1069, type=LAST_IN_PIPELINE terminating
18:46:45.952 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:45.953 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts dst=null perm=null proto=rpc
18:46:45.954 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.954 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:45.954 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam
18:46:45.955 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.955 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:45.956 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:45.956 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.956 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam done
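[Editor's note] The concat/rename pair logged above is how the single-file BAM is assembled from the .parts directory: the header, the sorted part file, and the BGZF terminator are merged block-wise into one HDFS file, which is then renamed over the destination path. A rough, hypothetical sketch of the same sequence follows; it assumes an HDFS-backed FileSystem (concat is HDFS-specific) and uses made-up paths rather than the test's.

    // Hedged sketch (illustrative, not the GATK/Disq implementation): mirror the
    // create -> concat -> rename -> delete sequence shown in the audit lines above.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConcatSketch {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            Path parts  = new Path("/user/runner/example.bam.parts");
            Path merged = new Path(parts, "output");

            fs.create(merged).close();                     // empty target, as in the log
            fs.concat(merged, new Path[] {                 // HDFS block-level concat, no byte copy
                new Path(parts, "header"),
                new Path(parts, "part-r-00000"),
                new Path(parts, "terminator")});
            fs.rename(merged, new Path("/user/runner/example.bam"));  // move into final place
            fs.delete(parts, true);                        // clean up the .parts directory
        }
    }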
18:46:45.957 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:45.957 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi
18:46:45.957 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts dst=null perm=null proto=rpc
18:46:45.958 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:45.958 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:45.959 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:45.960 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:45.960 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:45.961 INFO StateChange - BLOCK* allocate blk_1073741894_1070, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi
18:46:45.961 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741894_1070 src: /127.0.0.1:48200 dest: /127.0.0.1:38019
18:46:45.962 INFO clienttrace - src: /127.0.0.1:48200, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741894_1070, duration(ns): 347942
18:46:45.962 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741894_1070, type=LAST_IN_PIPELINE terminating
18:46:45.963 INFO FSNamesystem - BLOCK* blk_1073741894_1070 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi
18:46:46.364 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:46.365 INFO IndexFileMerger - Done merging .sbi files
18:46:46.365 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.parts dst=null perm=null proto=rpc
18:46:46.374 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi dst=null perm=null proto=rpc
18:46:46.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi dst=null perm=null proto=rpc
18:46:46.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi dst=null perm=null proto=rpc
18:46:46.376 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:46.376 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.378 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.bai dst=null perm=null proto=rpc
18:46:46.378 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bai dst=null perm=null proto=rpc
18:46:46.379 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:46.380 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:46.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi dst=null perm=null proto=rpc
18:46:46.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi dst=null perm=null proto=rpc
18:46:46.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.sbi dst=null perm=null proto=rpc
18:46:46.382 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:46.382 INFO MemoryStore - Block broadcast_465 stored as values in memory (estimated size 320.0 B, free 1916.8 MiB)
18:46:46.383 INFO MemoryStore - Block broadcast_465_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.8 MiB)
18:46:46.383 INFO BlockManagerInfo - Added broadcast_465_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.4 MiB)
18:46:46.383 INFO SparkContext - Created broadcast 465 from broadcast at BamSource.java:104
18:46:46.384 INFO MemoryStore - Block broadcast_466 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
18:46:46.390 INFO MemoryStore - Block broadcast_466_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:46.390 INFO BlockManagerInfo - Added broadcast_466_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.4 MiB)
18:46:46.390 INFO SparkContext - Created broadcast 466 from newAPIHadoopFile at PathSplitSource.java:96
18:46:46.399 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.399 INFO FileInputFormat - Total input files to process : 1
18:46:46.399 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.413 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:46.413 INFO DAGScheduler - Got job 174 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:46.413 INFO DAGScheduler - Final stage: ResultStage 232 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:46.413 INFO DAGScheduler - Parents of final stage: List()
18:46:46.413 INFO DAGScheduler - Missing parents: List()
18:46:46.413 INFO DAGScheduler - Submitting ResultStage 232 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:46.419 INFO MemoryStore - Block broadcast_467 stored as values in memory (estimated size 148.2 KiB, free 1916.3 MiB)
18:46:46.420 INFO MemoryStore - Block broadcast_467_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.2 MiB)
18:46:46.420 INFO BlockManagerInfo - Added broadcast_467_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.3 MiB)
18:46:46.420 INFO SparkContext - Created broadcast 467 from broadcast at DAGScheduler.scala:1580
18:46:46.420 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 232 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:46.420 INFO TaskSchedulerImpl - Adding task set 232.0 with 1 tasks resource profile 0
18:46:46.421 INFO TaskSetManager - Starting task 0.0 in stage 232.0 (TID 288) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:46.421 INFO Executor - Running task 0.0 in stage 232.0 (TID 288)
18:46:46.432 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam:0+237038
18:46:46.433 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.433 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.434 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.bai dst=null perm=null proto=rpc
18:46:46.434 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bai dst=null perm=null proto=rpc
18:46:46.435 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:46.437 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:46.437 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:46.439 INFO Executor - Finished task 0.0 in stage 232.0 (TID 288). 651483 bytes result sent to driver
18:46:46.441 INFO TaskSetManager - Finished task 0.0 in stage 232.0 (TID 288) in 20 ms on localhost (executor driver) (1/1)
18:46:46.441 INFO TaskSchedulerImpl - Removed TaskSet 232.0, whose tasks have all completed, from pool
18:46:46.441 INFO DAGScheduler - ResultStage 232 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.027 s
18:46:46.442 INFO DAGScheduler - Job 174 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:46.442 INFO TaskSchedulerImpl - Killing all running tasks in stage 232: Stage finished
18:46:46.442 INFO DAGScheduler - Job 174 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.028456 s
18:46:46.451 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:46.451 INFO DAGScheduler - Got job 175 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:46.451 INFO DAGScheduler - Final stage: ResultStage 233 (count at ReadsSparkSinkUnitTest.java:185)
18:46:46.451 INFO DAGScheduler - Parents of final stage: List()
18:46:46.451 INFO DAGScheduler - Missing parents: List()
18:46:46.451 INFO DAGScheduler - Submitting ResultStage 233 (MapPartitionsRDD[1097] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:46.468 INFO MemoryStore - Block broadcast_468 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
18:46:46.469 INFO MemoryStore - Block broadcast_468_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
18:46:46.469 INFO BlockManagerInfo - Added broadcast_468_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.2 MiB)
18:46:46.469 INFO SparkContext - Created broadcast 468 from broadcast at DAGScheduler.scala:1580
18:46:46.470 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 233 (MapPartitionsRDD[1097] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:46.470 INFO TaskSchedulerImpl - Adding task set 233.0 with 1 tasks resource profile 0
18:46:46.470 INFO TaskSetManager - Starting task 0.0 in stage 233.0 (TID 289) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:46.470 INFO Executor - Running task 0.0 in stage 233.0 (TID 289)
18:46:46.499 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:46.508 INFO Executor - Finished task 0.0 in stage 233.0 (TID 289). 989 bytes result sent to driver
18:46:46.509 INFO TaskSetManager - Finished task 0.0 in stage 233.0 (TID 289) in 39 ms on localhost (executor driver) (1/1)
18:46:46.509 INFO TaskSchedulerImpl - Removed TaskSet 233.0, whose tasks have all completed, from pool
18:46:46.509 INFO DAGScheduler - ResultStage 233 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
18:46:46.509 INFO DAGScheduler - Job 175 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:46.509 INFO TaskSchedulerImpl - Killing all running tasks in stage 233: Stage finished
18:46:46.509 INFO DAGScheduler - Job 175 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058418 s
18:46:46.512 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:46.513 INFO DAGScheduler - Got job 176 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:46.513 INFO DAGScheduler - Final stage: ResultStage 234 (count at ReadsSparkSinkUnitTest.java:185)
18:46:46.513 INFO DAGScheduler - Parents of final stage: List()
18:46:46.513 INFO DAGScheduler - Missing parents: List()
18:46:46.513 INFO DAGScheduler - Submitting ResultStage 234 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:46.519 INFO MemoryStore - Block broadcast_469 stored as values in memory (estimated size 148.1 KiB, free 1915.5 MiB)
18:46:46.520 INFO MemoryStore - Block broadcast_469_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.5 MiB)
18:46:46.520 INFO BlockManagerInfo - Added broadcast_469_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.1 MiB)
18:46:46.520 INFO SparkContext - Created broadcast 469 from broadcast at DAGScheduler.scala:1580
18:46:46.520 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 234 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:46.520 INFO TaskSchedulerImpl - Adding task set 234.0 with 1 tasks resource profile 0
18:46:46.521 INFO TaskSetManager - Starting task 0.0 in stage 234.0 (TID 290) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:46.521 INFO Executor - Running task 0.0 in stage 234.0 (TID 290)
18:46:46.531 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam:0+237038
18:46:46.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam dst=null perm=null proto=rpc
18:46:46.533 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bam.bai dst=null perm=null proto=rpc
18:46:46.533 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a273ccf4-befc-4be4-9099-96b2d242b383.bai dst=null perm=null proto=rpc
18:46:46.534 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:46.537 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:46.537 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:46.539 INFO Executor - Finished task 0.0 in stage 234.0 (TID 290). 989 bytes result sent to driver
18:46:46.539 INFO TaskSetManager - Finished task 0.0 in stage 234.0 (TID 290) in 19 ms on localhost (executor driver) (1/1)
18:46:46.539 INFO TaskSchedulerImpl - Removed TaskSet 234.0, whose tasks have all completed, from pool
18:46:46.539 INFO DAGScheduler - ResultStage 234 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
18:46:46.539 INFO DAGScheduler - Job 176 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:46.539 INFO TaskSchedulerImpl - Killing all running tasks in stage 234: Stage finished
18:46:46.539 INFO DAGScheduler - Job 176 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026836 s
18:46:46.547 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:46.548 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:46.549 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:46.549 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:46.551 INFO MemoryStore - Block broadcast_470 stored as values in memory (estimated size 297.9 KiB, free 1915.2 MiB)
18:46:46.557 INFO MemoryStore - Block broadcast_470_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.1 MiB)
18:46:46.557 INFO BlockManagerInfo - Added broadcast_470_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.1 MiB)
18:46:46.558 INFO SparkContext - Created broadcast 470 from newAPIHadoopFile at PathSplitSource.java:96
18:46:46.579 INFO MemoryStore - Block broadcast_471 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
18:46:46.585 INFO MemoryStore - Block broadcast_471_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1914.8 MiB)
18:46:46.585 INFO BlockManagerInfo - Added broadcast_471_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.0 MiB)
18:46:46.585 INFO SparkContext - Created broadcast 471 from newAPIHadoopFile at PathSplitSource.java:96
18:46:46.604 INFO FileInputFormat - Total input files to process : 1
18:46:46.606 INFO MemoryStore - Block broadcast_472 stored as values in memory (estimated size 160.7 KiB, free 1914.6 MiB)
18:46:46.610 INFO BlockManagerInfo - Removed broadcast_455_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:46.610 INFO MemoryStore - Block broadcast_472_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.0 MiB)
18:46:46.611 INFO BlockManagerInfo - Added broadcast_472_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.0 MiB)
18:46:46.611 INFO BlockManagerInfo - Removed broadcast_471_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:46.611 INFO SparkContext - Created broadcast 472 from broadcast at ReadsSparkSink.java:133
18:46:46.611 INFO BlockManagerInfo - Removed broadcast_466_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:46.611 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:46.611 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
18:46:46.611 INFO BlockManagerInfo - Removed broadcast_458_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:46.612 INFO BlockManagerInfo - Removed broadcast_463_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.5 MiB)
18:46:46.612 INFO BlockManagerInfo - Removed broadcast_467_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.5 MiB)
18:46:46.612 INFO BlockManagerInfo - Removed broadcast_465_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.5 MiB)
18:46:46.612 INFO MemoryStore - Block broadcast_473 stored as values in memory (estimated size 163.2 KiB, free 1916.9 MiB)
18:46:46.613 INFO BlockManagerInfo - Removed broadcast_461_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.5 MiB)
18:46:46.613 INFO MemoryStore - Block broadcast_473_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.1 MiB)
18:46:46.613 INFO BlockManagerInfo - Added broadcast_473_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:46.613 INFO SparkContext - Created broadcast 473 from broadcast at BamSink.java:76
18:46:46.614 INFO BlockManagerInfo - Removed broadcast_468_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:46.614 INFO BlockManagerInfo - Removed broadcast_464_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.7 MiB)
18:46:46.614 INFO BlockManagerInfo - Removed broadcast_462_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.7 MiB)
18:46:46.615 INFO BlockManagerInfo - Removed broadcast_449_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:46.615 INFO BlockManagerInfo - Removed broadcast_460_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:46.616 INFO BlockManagerInfo - Removed broadcast_469_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.9 MiB)
18:46:46.616 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts dst=null perm=null proto=rpc
18:46:46.616 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:46.616 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:46.616 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:46.616 INFO BlockManagerInfo - Removed broadcast_459_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.9 MiB)
18:46:46.617 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:46.625 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:46.625 INFO DAGScheduler - Registering RDD 1129 (mapToPair at SparkUtils.java:161) as input to shuffle 47
18:46:46.626 INFO DAGScheduler - Got job 177 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:46.626 INFO DAGScheduler - Final stage: ResultStage 236 (runJob at SparkHadoopWriter.scala:83)
18:46:46.626 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 235)
18:46:46.626 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 235)
18:46:46.626 INFO DAGScheduler - Submitting ShuffleMapStage 235 (MapPartitionsRDD[1129] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:46.643 INFO MemoryStore - Block broadcast_474 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
18:46:46.644 INFO MemoryStore - Block broadcast_474_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
18:46:46.645 INFO BlockManagerInfo - Added broadcast_474_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.8 MiB)
18:46:46.645 INFO SparkContext - Created broadcast 474 from broadcast at DAGScheduler.scala:1580
18:46:46.645 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 235 (MapPartitionsRDD[1129] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:46.645 INFO TaskSchedulerImpl - Adding task set 235.0 with 1 tasks resource profile 0
18:46:46.646 INFO TaskSetManager - Starting task 0.0 in stage 235.0 (TID 291) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:46.646 INFO Executor - Running task 0.0 in stage 235.0 (TID 291)
18:46:46.676 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:46.690 INFO Executor - Finished task 0.0 in stage 235.0 (TID 291). 1148 bytes result sent to driver
18:46:46.691 INFO TaskSetManager - Finished task 0.0 in stage 235.0 (TID 291) in 46 ms on localhost (executor driver) (1/1)
18:46:46.691 INFO TaskSchedulerImpl - Removed TaskSet 235.0, whose tasks have all completed, from pool
18:46:46.691 INFO DAGScheduler - ShuffleMapStage 235 (mapToPair at SparkUtils.java:161) finished in 0.065 s
18:46:46.691 INFO DAGScheduler - looking for newly runnable stages
18:46:46.691 INFO DAGScheduler - running: HashSet()
18:46:46.691 INFO DAGScheduler - waiting: HashSet(ResultStage 236)
18:46:46.691 INFO DAGScheduler - failed: HashSet()
18:46:46.691 INFO DAGScheduler - Submitting ResultStage 236 (MapPartitionsRDD[1134] at mapToPair at BamSink.java:91), which has no missing parents
18:46:46.698 INFO MemoryStore - Block broadcast_475 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
18:46:46.699 INFO MemoryStore - Block broadcast_475_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
18:46:46.699 INFO BlockManagerInfo - Added broadcast_475_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.7 MiB)
18:46:46.699 INFO SparkContext - Created broadcast 475 from broadcast at DAGScheduler.scala:1580
18:46:46.699 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 236 (MapPartitionsRDD[1134] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:46.699 INFO TaskSchedulerImpl - Adding task set 236.0 with 1 tasks resource profile 0
18:46:46.700 INFO TaskSetManager - Starting task 0.0 in stage 236.0 (TID 292) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:46.700 INFO Executor - Running task 0.0 in stage 236.0 (TID 292)
18:46:46.704 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:46.704 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:46.715 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:46.715 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:46.715 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:46.715 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:46.715 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:46.715 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:46.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/_temporary/attempt_202505191846468575769758718573508_1134_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:46.718 INFO StateChange - BLOCK* allocate blk_1073741895_1071, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/_temporary/attempt_202505191846468575769758718573508_1134_r_000000_0/part-r-00000
18:46:46.719 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741895_1071 src: /127.0.0.1:48208 dest: /127.0.0.1:38019
18:46:46.720 INFO clienttrace - src: /127.0.0.1:48208, dest: /127.0.0.1:38019, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741895_1071, duration(ns): 1053203
18:46:46.721 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741895_1071, type=LAST_IN_PIPELINE terminating
18:46:46.721 INFO FSNamesystem - BLOCK* blk_1073741895_1071 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/_temporary/attempt_202505191846468575769758718573508_1134_r_000000_0/part-r-00000
18:46:47.122 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/_temporary/attempt_202505191846468575769758718573508_1134_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:47.123 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/_temporary/attempt_202505191846468575769758718573508_1134_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:47.124 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/_temporary/attempt_202505191846468575769758718573508_1134_r_000000_0 dst=null perm=null proto=rpc
18:46:47.124 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/_temporary/attempt_202505191846468575769758718573508_1134_r_000000_0 dst=null perm=null proto=rpc
18:46:47.125 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/task_202505191846468575769758718573508_1134_r_000000 dst=null perm=null proto=rpc
18:46:47.125 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/_temporary/attempt_202505191846468575769758718573508_1134_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/task_202505191846468575769758718573508_1134_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:47.125 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846468575769758718573508_1134_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/task_202505191846468575769758718573508_1134_r_000000
18:46:47.125 INFO SparkHadoopMapRedUtil - attempt_202505191846468575769758718573508_1134_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:47.126 INFO Executor - Finished task 0.0 in stage 236.0 (TID 292). 1858 bytes result sent to driver
18:46:47.126 INFO TaskSetManager - Finished task 0.0 in stage 236.0 (TID 292) in 427 ms on localhost (executor driver) (1/1)
18:46:47.126 INFO TaskSchedulerImpl - Removed TaskSet 236.0, whose tasks have all completed, from pool
18:46:47.127 INFO DAGScheduler - ResultStage 236 (runJob at SparkHadoopWriter.scala:83) finished in 0.434 s
18:46:47.127 INFO DAGScheduler - Job 177 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:47.127 INFO TaskSchedulerImpl - Killing all running tasks in stage 236: Stage finished
18:46:47.127 INFO DAGScheduler - Job 177 finished: runJob at SparkHadoopWriter.scala:83, took 0.501604 s
18:46:47.127 INFO SparkHadoopWriter - Start to commit write Job job_202505191846468575769758718573508_1134.
18:46:47.127 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:47.128 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts dst=null perm=null proto=rpc
18:46:47.128 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/task_202505191846468575769758718573508_1134_r_000000 dst=null perm=null proto=rpc
18:46:47.129 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:47.129 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary/0/task_202505191846468575769758718573508_1134_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:47.130 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:47.130 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:47.132 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:47.133 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/.spark-staging-1134 dst=null perm=null proto=rpc
18:46:47.133 INFO SparkHadoopWriter - Write Job job_202505191846468575769758718573508_1134 committed. Elapsed time: 5 ms.
18:46:47.133 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:47.134 INFO StateChange - BLOCK* allocate blk_1073741896_1072, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/header
18:46:47.135 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741896_1072 src: /127.0.0.1:48216 dest: /127.0.0.1:38019
18:46:47.136 INFO clienttrace - src: /127.0.0.1:48216, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741896_1072, duration(ns): 435777
18:46:47.136 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741896_1072, type=LAST_IN_PIPELINE terminating
18:46:47.137 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:47.138 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:47.139 INFO StateChange - BLOCK* allocate blk_1073741897_1073, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/terminator
18:46:47.139 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741897_1073 src: /127.0.0.1:48220 dest: /127.0.0.1:38019
18:46:47.140 INFO clienttrace - src: /127.0.0.1:48220, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741897_1073, duration(ns): 374819
18:46:47.140 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741897_1073, type=LAST_IN_PIPELINE terminating
18:46:47.141 INFO FSNamesystem - BLOCK* blk_1073741897_1073 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/terminator
18:46:47.542 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:47.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts dst=null perm=null proto=rpc
18:46:47.543 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:47.544 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:47.545 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam
18:46:47.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:47.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:47.547 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam done
18:46:47.547 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.547 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.parts dst=null perm=null proto=rpc
18:46:47.548 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.548 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.549 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.549 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.550 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.bai dst=null perm=null proto=rpc
18:46:47.550 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bai dst=null perm=null proto=rpc
18:46:47.552 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:47.553 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:47.553 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.sbi dst=null perm=null proto=rpc
18:46:47.554 INFO MemoryStore - Block broadcast_476 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:47.560 INFO MemoryStore - Block broadcast_476_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:47.560 INFO BlockManagerInfo - Added broadcast_476_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:47.561 INFO SparkContext - Created broadcast 476 from newAPIHadoopFile at PathSplitSource.java:96
18:46:47.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.581 INFO FileInputFormat - Total input files to process : 1
18:46:47.581 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.616 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:47.617 INFO DAGScheduler - Got job 178 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:47.617 INFO DAGScheduler - Final stage: ResultStage 237 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:47.617 INFO DAGScheduler - Parents of final stage: List()
18:46:47.617 INFO DAGScheduler - Missing parents: List()
18:46:47.617 INFO DAGScheduler - Submitting ResultStage 237 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:47.633 INFO MemoryStore - Block broadcast_477 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
18:46:47.635 INFO MemoryStore - Block broadcast_477_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1917.4 MiB)
18:46:47.635 INFO BlockManagerInfo - Added broadcast_477_piece0 in memory on localhost:45727 (size: 153.7 KiB, free: 1919.5 MiB)
18:46:47.635 INFO SparkContext - Created broadcast 477 from broadcast at DAGScheduler.scala:1580
18:46:47.635 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 237 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:47.635 INFO TaskSchedulerImpl - Adding task set 237.0 with 1 tasks resource profile 0
18:46:47.636 INFO TaskSetManager - Starting task 0.0 in stage 237.0 (TID 293) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:47.636 INFO Executor - Running task 0.0 in stage 237.0 (TID 293)
18:46:47.665 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam:0+237038
18:46:47.665 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.666 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.667 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:47.668 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.668 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.669 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.bai dst=null perm=null proto=rpc
18:46:47.669 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bai dst=null perm=null proto=rpc
18:46:47.670 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:47.672 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.672 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
[64 consecutive "WARN DFSUtil - Unexpected value for data transfer" messages (18:46:47.672 to 18:46:47.719; bytes=233106, 5760, or 1632; duration=0) elided]
18:46:47.720 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.720 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.721 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.bai dst=null perm=null proto=rpc
18:46:47.721 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bai dst=null perm=null proto=rpc
18:46:47.722 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:47.724 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:47.725 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:47.727 INFO Executor - Finished task 0.0 in stage 237.0 (TID 293). 651483 bytes result sent to driver
18:46:47.728 INFO TaskSetManager - Finished task 0.0 in stage 237.0 (TID 293) in 92 ms on localhost (executor driver) (1/1)
18:46:47.728 INFO TaskSchedulerImpl - Removed TaskSet 237.0, whose tasks have all completed, from pool
18:46:47.728 INFO DAGScheduler - ResultStage 237 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.111 s
18:46:47.728 INFO DAGScheduler - Job 178 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:47.728 INFO TaskSchedulerImpl - Killing all running tasks in stage 237: Stage finished
18:46:47.729 INFO DAGScheduler - Job 178 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.112202 s
18:46:47.739 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:47.739 INFO DAGScheduler - Got job 179 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:47.739 INFO DAGScheduler - Final stage: ResultStage 238 (count at ReadsSparkSinkUnitTest.java:185)
18:46:47.739 INFO DAGScheduler - Parents of final stage: List()
18:46:47.739 INFO DAGScheduler - Missing parents: List()
18:46:47.739 INFO DAGScheduler - Submitting ResultStage 238 (MapPartitionsRDD[1122] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:47.762 INFO MemoryStore - Block broadcast_478 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
18:46:47.763 INFO MemoryStore - Block broadcast_478_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
18:46:47.763 INFO BlockManagerInfo - Added broadcast_478_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.4 MiB)
18:46:47.763 INFO SparkContext - Created broadcast 478 from broadcast at DAGScheduler.scala:1580
18:46:47.763 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 238 (MapPartitionsRDD[1122] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:47.763 INFO TaskSchedulerImpl - Adding task set 238.0 with 1 tasks resource profile 0
18:46:47.764 INFO TaskSetManager - Starting task 0.0 in stage 238.0 (TID 294) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:47.764 INFO Executor - Running task 0.0 in stage 238.0 (TID 294)
18:46:47.793 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:47.802 INFO Executor - Finished task 0.0 in stage 238.0 (TID 294). 989 bytes result sent to driver
18:46:47.802 INFO TaskSetManager - Finished task 0.0 in stage 238.0 (TID 294) in 38 ms on localhost (executor driver) (1/1)
18:46:47.802 INFO TaskSchedulerImpl - Removed TaskSet 238.0, whose tasks have all completed, from pool
18:46:47.802 INFO DAGScheduler - ResultStage 238 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
18:46:47.802 INFO DAGScheduler - Job 179 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:47.803 INFO TaskSchedulerImpl - Killing all running tasks in stage 238: Stage finished
18:46:47.803 INFO DAGScheduler - Job 179 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.063876 s
18:46:47.806 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:47.806 INFO DAGScheduler - Got job 180 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:47.806 INFO DAGScheduler - Final stage: ResultStage 239 (count at ReadsSparkSinkUnitTest.java:185)
18:46:47.806 INFO DAGScheduler - Parents of final stage: List()
18:46:47.806 INFO DAGScheduler - Missing parents: List()
18:46:47.806 INFO DAGScheduler - Submitting ResultStage 239 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:47.823 INFO MemoryStore - Block broadcast_479 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
18:46:47.824 INFO MemoryStore - Block broadcast_479_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
18:46:47.824 INFO BlockManagerInfo - Added broadcast_479_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.2 MiB)
18:46:47.824 INFO SparkContext - Created broadcast 479 from broadcast at DAGScheduler.scala:1580
18:46:47.824 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 239 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:47.824 INFO TaskSchedulerImpl - Adding task set 239.0 with 1 tasks resource profile 0
18:46:47.825 INFO TaskSetManager - Starting task 0.0 in stage 239.0 (TID 295) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:47.825 INFO Executor - Running task 0.0 in stage 239.0 (TID 295)
18:46:47.854 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam:0+237038
18:46:47.855 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.855 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.856 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:47.856 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.857 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.857 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.bai dst=null perm=null proto=rpc
18:46:47.858 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bai dst=null perm=null proto=rpc
18:46:47.859 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:47.860 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.860 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
[65 consecutive "WARN DFSUtil - Unexpected value for data transfer" messages (18:46:47.861 to 18:46:47.909; bytes=233106, 5760, or 1632; duration=0) elided]
18:46:47.909 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.910 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam dst=null perm=null proto=rpc
18:46:47.911 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bam.bai dst=null perm=null proto=rpc
18:46:47.911 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_cf7a7dd1-9d3a-4c92-a7ab-d635b6994303.bai dst=null perm=null proto=rpc
18:46:47.913 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:47.915 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
18:46:47.915 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:47.917 INFO Executor - Finished task 0.0 in stage 239.0 (TID 295). 989 bytes result sent to driver
18:46:47.918 INFO TaskSetManager - Finished task 0.0 in stage 239.0 (TID 295) in 93 ms on localhost (executor driver) (1/1)
18:46:47.918 INFO TaskSchedulerImpl - Removed TaskSet 239.0, whose tasks have all completed, from pool
18:46:47.918 INFO DAGScheduler - ResultStage 239 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.112 s
18:46:47.918 INFO DAGScheduler - Job 180 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:47.918 INFO TaskSchedulerImpl - Killing all running tasks in stage 239: Stage finished
18:46:47.918 INFO DAGScheduler - Job 180 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.112218 s
18:46:47.927 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:47.927 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:47.928 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:47.929 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:47.931 INFO MemoryStore - Block broadcast_480 stored as values in memory (estimated size 298.0 KiB, free 1916.0 MiB)
18:46:47.937 INFO MemoryStore - Block broadcast_480_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.0 MiB)
18:46:47.937 INFO BlockManagerInfo - Added broadcast_480_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.2 MiB)
18:46:47.937 INFO SparkContext - Created broadcast 480 from newAPIHadoopFile at PathSplitSource.java:96
18:46:47.958 INFO MemoryStore - Block broadcast_481 stored as values in memory (estimated size 298.0 KiB, free 1915.7 MiB)
18:46:47.964 INFO MemoryStore - Block broadcast_481_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1915.6 MiB)
18:46:47.964 INFO BlockManagerInfo - Added broadcast_481_piece0 in memory on localhost:45727 (size: 50.3 KiB, free: 1919.1 MiB)
18:46:47.964 INFO SparkContext - Created broadcast 481 from newAPIHadoopFile at PathSplitSource.java:96
18:46:47.984 INFO FileInputFormat - Total input files to process : 1
18:46:47.985 INFO MemoryStore - Block broadcast_482 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
18:46:47.986 INFO MemoryStore - Block broadcast_482_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
18:46:47.986 INFO BlockManagerInfo - Added broadcast_482_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.1 MiB)
18:46:47.986 INFO SparkContext - Created broadcast 482 from broadcast at ReadsSparkSink.java:133
18:46:47.987 INFO MemoryStore - Block broadcast_483 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
18:46:47.988 INFO MemoryStore - Block broadcast_483_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
18:46:47.988 INFO BlockManagerInfo - Added broadcast_483_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.1 MiB)
18:46:47.988 INFO SparkContext - Created broadcast 483 from broadcast at BamSink.java:76
18:46:47.990 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts dst=null perm=null proto=rpc
18:46:47.991 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:47.991 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:47.991 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:47.991 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:47.997 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:47.997 INFO DAGScheduler - Registering RDD 1155 (mapToPair at SparkUtils.java:161) as input to shuffle 48
18:46:47.998 INFO DAGScheduler - Got job 181 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:47.998 INFO DAGScheduler - Final stage: ResultStage 241 (runJob at SparkHadoopWriter.scala:83)
18:46:47.998 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 240)
18:46:47.998 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 240)
18:46:47.998 INFO DAGScheduler - Submitting ShuffleMapStage 240 (MapPartitionsRDD[1155] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:48.015 INFO MemoryStore - Block broadcast_484 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
18:46:48.020 INFO BlockManagerInfo - Removed broadcast_470_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.1 MiB)
18:46:48.020 INFO BlockManagerInfo - Removed broadcast_477_piece0 on localhost:45727 in memory (size: 153.7 KiB, free: 1919.3 MiB)
18:46:48.021 INFO BlockManagerInfo - Removed broadcast_475_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1919.4 MiB)
18:46:48.021 INFO MemoryStore - Block broadcast_484_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.6 MiB)
18:46:48.021 INFO BlockManagerInfo - Added broadcast_484_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.2 MiB)
18:46:48.021 INFO BlockManagerInfo - Removed broadcast_479_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.3 MiB)
18:46:48.021 INFO SparkContext - Created broadcast 484 from broadcast at DAGScheduler.scala:1580
18:46:48.021 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 240 (MapPartitionsRDD[1155] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:48.021 INFO TaskSchedulerImpl - Adding task set 240.0 with 1 tasks resource profile 0
18:46:48.022 INFO BlockManagerInfo - Removed broadcast_474_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.5 MiB)
18:46:48.022 INFO TaskSetManager - Starting task 0.0 in stage 240.0 (TID 296) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
18:46:48.022 INFO Executor - Running task 0.0 in stage 240.0 (TID 296)
18:46:48.022 INFO BlockManagerInfo - Removed broadcast_478_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:48.023 INFO BlockManagerInfo - Removed broadcast_476_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.7 MiB)
18:46:48.023 INFO BlockManagerInfo - Removed broadcast_473_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.7 MiB)
18:46:48.024 INFO BlockManagerInfo - Removed broadcast_481_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.8 MiB)
18:46:48.025 INFO BlockManagerInfo - Removed broadcast_472_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.8 MiB)
18:46:48.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741891_1067 replica FinalizedReplica, blk_1073741891_1067, FINALIZED
getNumBytes() = 212
getBytesOnDisk() = 212
getVisibleLength()= 212
getVolume() = /tmp/minicluster_storage13238592372457082651/data/data1
getBlockURI() = file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741891 for deletion
18:46:48.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741891_1067 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741891
18:46:48.054 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:48.070 INFO Executor - Finished task 0.0 in stage 240.0 (TID 296). 1148 bytes result sent to driver
18:46:48.070 INFO TaskSetManager - Finished task 0.0 in stage 240.0 (TID 296) in 48 ms on localhost (executor driver) (1/1)
18:46:48.070 INFO TaskSchedulerImpl - Removed TaskSet 240.0, whose tasks have all completed, from pool
18:46:48.070 INFO DAGScheduler - ShuffleMapStage 240 (mapToPair at SparkUtils.java:161) finished in 0.072 s
18:46:48.070 INFO DAGScheduler - looking for newly runnable stages
18:46:48.070 INFO DAGScheduler - running: HashSet()
18:46:48.070 INFO DAGScheduler - waiting: HashSet(ResultStage 241)
18:46:48.070 INFO DAGScheduler - failed: HashSet()
18:46:48.070 INFO DAGScheduler - Submitting ResultStage 241 (MapPartitionsRDD[1160] at mapToPair at BamSink.java:91), which has no missing parents
18:46:48.077 INFO MemoryStore - Block broadcast_485 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
18:46:48.078 INFO MemoryStore - Block broadcast_485_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
18:46:48.078 INFO BlockManagerInfo - Added broadcast_485_piece0 in memory on localhost:45727 (size: 67.1 KiB, free: 1919.7 MiB)
18:46:48.078 INFO SparkContext - Created broadcast 485 from broadcast at DAGScheduler.scala:1580
18:46:48.078 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 241 (MapPartitionsRDD[1160] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:48.078 INFO TaskSchedulerImpl - Adding task set 241.0 with 1 tasks resource profile 0
18:46:48.079 INFO TaskSetManager - Starting task 0.0 in stage 241.0 (TID 297) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:48.079 INFO Executor - Running task 0.0 in stage 241.0 (TID 297)
18:46:48.085 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:48.085 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:48.096 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:48.096 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:48.096 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:48.096 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:48.096 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:48.096 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:48.097 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.098 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.099 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.101 INFO StateChange - BLOCK* allocate blk_1073741898_1074, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/part-r-00000
18:46:48.102 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741898_1074 src: /127.0.0.1:58646 dest: /127.0.0.1:38019
18:46:48.104 INFO clienttrace - src: /127.0.0.1:58646, dest: /127.0.0.1:38019, bytes: 229774, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741898_1074, duration(ns): 875482
18:46:48.104 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741898_1074, type=LAST_IN_PIPELINE terminating
18:46:48.104 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:48.105 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:48.105 INFO StateChange - BLOCK* allocate blk_1073741899_1075, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/.part-r-00000.sbi
18:46:48.106 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741899_1075 src: /127.0.0.1:58654 dest: /127.0.0.1:38019
18:46:48.107 INFO clienttrace - src: /127.0.0.1:58654, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741899_1075, duration(ns): 345537
18:46:48.107 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741899_1075, type=LAST_IN_PIPELINE terminating
18:46:48.108 INFO FSNamesystem - BLOCK* blk_1073741899_1075 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/.part-r-00000.sbi
18:46:48.508 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:48.511 INFO StateChange - BLOCK* allocate blk_1073741900_1076, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/.part-r-00000.bai
18:46:48.511 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741900_1076 src: /127.0.0.1:58664 dest: /127.0.0.1:38019
18:46:48.512 INFO clienttrace - src: /127.0.0.1:58664, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741900_1076, duration(ns): 473712
18:46:48.513 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741900_1076, type=LAST_IN_PIPELINE terminating
18:46:48.513 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:48.514 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0 dst=null perm=null proto=rpc
18:46:48.514 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0 dst=null perm=null proto=rpc
18:46:48.515 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/task_202505191846474670623425386933122_1160_r_000000 dst=null perm=null proto=rpc
18:46:48.515 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/_temporary/attempt_202505191846474670623425386933122_1160_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/task_202505191846474670623425386933122_1160_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:48.515 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846474670623425386933122_1160_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/task_202505191846474670623425386933122_1160_r_000000
18:46:48.515 INFO SparkHadoopMapRedUtil - attempt_202505191846474670623425386933122_1160_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:48.516 INFO Executor - Finished task 0.0 in stage 241.0 (TID 297). 1858 bytes result sent to driver
18:46:48.516 INFO TaskSetManager - Finished task 0.0 in stage 241.0 (TID 297) in 438 ms on localhost (executor driver) (1/1)
18:46:48.516 INFO TaskSchedulerImpl - Removed TaskSet 241.0, whose tasks have all completed, from pool
18:46:48.516 INFO DAGScheduler - ResultStage 241 (runJob at SparkHadoopWriter.scala:83) finished in 0.445 s
18:46:48.516 INFO DAGScheduler - Job 181 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:48.516 INFO TaskSchedulerImpl - Killing all running tasks in stage 241: Stage finished
18:46:48.516 INFO DAGScheduler - Job 181 finished: runJob at SparkHadoopWriter.scala:83, took 0.519360 s
18:46:48.517 INFO SparkHadoopWriter - Start to commit write Job job_202505191846474670623425386933122_1160.
18:46:48.517 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:48.517 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts dst=null perm=null proto=rpc
18:46:48.518 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/task_202505191846474670623425386933122_1160_r_000000 dst=null perm=null proto=rpc
18:46:48.518 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:48.519 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/task_202505191846474670623425386933122_1160_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.519 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:48.519 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/task_202505191846474670623425386933122_1160_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:48.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary/0/task_202505191846474670623425386933122_1160_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.521 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:48.521 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.522 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:48.522 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.spark-staging-1160 dst=null perm=null proto=rpc
18:46:48.522 INFO SparkHadoopWriter - Write Job job_202505191846474670623425386933122_1160 committed. Elapsed time: 5 ms.
18:46:48.523 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.524 INFO StateChange - BLOCK* allocate blk_1073741901_1077, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/header
18:46:48.525 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741901_1077 src: /127.0.0.1:58668 dest: /127.0.0.1:38019
18:46:48.526 INFO clienttrace - src: /127.0.0.1:58668, dest: /127.0.0.1:38019, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741901_1077, duration(ns): 413676
18:46:48.526 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741901_1077, type=LAST_IN_PIPELINE terminating
18:46:48.526 INFO FSNamesystem - BLOCK* blk_1073741901_1077 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/header
18:46:48.927 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:48.928 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:48.929 INFO StateChange - BLOCK* allocate blk_1073741902_1078, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/terminator
18:46:48.930 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741902_1078 src: /127.0.0.1:58674 dest: /127.0.0.1:38019
18:46:48.931 INFO clienttrace - src: /127.0.0.1:58674, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741902_1078, duration(ns): 353981
18:46:48.931 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741902_1078, type=LAST_IN_PIPELINE terminating
18:46:48.931 INFO FSNamesystem - BLOCK* blk_1073741902_1078 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/terminator
18:46:49.332 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:49.333 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts dst=null perm=null proto=rpc
18:46:49.334 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:49.334 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:49.334 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam
18:46:49.335 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:49.335 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:49.336 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:49.336 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:49.336 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam done
18:46:49.337 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:49.337 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi
18:46:49.337 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts dst=null perm=null proto=rpc
18:46:49.338 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:49.338 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:49.339 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:49.340 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:49.340 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:49.341 INFO StateChange - BLOCK* allocate blk_1073741903_1079, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi
18:46:49.341 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741903_1079 src: /127.0.0.1:58684 dest: /127.0.0.1:38019
18:46:49.342 INFO clienttrace - src: /127.0.0.1:58684, dest: /127.0.0.1:38019, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741903_1079, duration(ns): 422630
18:46:49.342 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741903_1079, type=LAST_IN_PIPELINE terminating
18:46:49.343 INFO FSNamesystem - BLOCK* blk_1073741903_1079 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi
18:46:49.744 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:49.744 INFO IndexFileMerger - Done merging .sbi files
18:46:49.744 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai
18:46:49.744 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts dst=null perm=null proto=rpc
18:46:49.745 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:49.746 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:49.746 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:49.747 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:49.748 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:49.749 INFO StateChange - BLOCK* allocate blk_1073741904_1080, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai
18:46:49.749 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741904_1080 src: /127.0.0.1:58690 dest: /127.0.0.1:38019
18:46:49.750 INFO clienttrace - src: /127.0.0.1:58690, dest: /127.0.0.1:38019, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741904_1080, duration(ns): 378256
18:46:49.750 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741904_1080, type=LAST_IN_PIPELINE terminating
18:46:49.751 INFO FSNamesystem - BLOCK* blk_1073741904_1080 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai
18:46:50.151 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:50.152 INFO IndexFileMerger - Done merging .bai files
18:46:50.152 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.parts dst=null perm=null proto=rpc
18:46:50.161 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.169 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi dst=null perm=null proto=rpc
18:46:50.169 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi dst=null perm=null proto=rpc
18:46:50.170 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi dst=null perm=null proto=rpc
18:46:50.170 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:50.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.174 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:50.175 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:50.176 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:50.176 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
18:46:50.176 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi dst=null perm=null proto=rpc
18:46:50.176 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi dst=null perm=null proto=rpc
18:46:50.177 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.sbi dst=null perm=null proto=rpc
18:46:50.178 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
18:46:50.178 INFO MemoryStore - Block broadcast_486 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
18:46:50.178 INFO MemoryStore - Block broadcast_486_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
18:46:50.178 INFO BlockManagerInfo - Added broadcast_486_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.7 MiB)
18:46:50.179 INFO SparkContext - Created broadcast 486 from broadcast at BamSource.java:104
18:46:50.179 INFO MemoryStore - Block broadcast_487 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:50.185 INFO MemoryStore - Block broadcast_487_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:50.186 INFO BlockManagerInfo - Added broadcast_487_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:50.186 INFO SparkContext - Created broadcast 487 from newAPIHadoopFile at PathSplitSource.java:96
18:46:50.194 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.195 INFO FileInputFormat - Total input files to process : 1
18:46:50.195 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.209 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:50.210 INFO DAGScheduler - Got job 182 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:50.210 INFO DAGScheduler - Final stage: ResultStage 242 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:50.210 INFO DAGScheduler - Parents of final stage: List()
18:46:50.210 INFO DAGScheduler - Missing parents: List()
18:46:50.210 INFO DAGScheduler - Submitting ResultStage 242 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:50.216 INFO MemoryStore - Block broadcast_488 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
18:46:50.217 INFO MemoryStore - Block broadcast_488_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
18:46:50.217 INFO BlockManagerInfo - Added broadcast_488_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:50.217 INFO SparkContext - Created broadcast 488 from broadcast at DAGScheduler.scala:1580
18:46:50.217 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 242 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:50.217 INFO TaskSchedulerImpl - Adding task set 242.0 with 1 tasks resource profile 0
18:46:50.217 INFO TaskSetManager - Starting task 0.0 in stage 242.0 (TID 298) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:50.218 INFO Executor - Running task 0.0 in stage 242.0 (TID 298)
18:46:50.229 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam:0+235514
18:46:50.229 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.230 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.231 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.231 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.231 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.233 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:50.234 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:50.234 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:50.236 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:50.238 INFO Executor - Finished task 0.0 in stage 242.0 (TID 298). 650141 bytes result sent to driver
18:46:50.239 INFO TaskSetManager - Finished task 0.0 in stage 242.0 (TID 298) in 22 ms on localhost (executor driver) (1/1)
18:46:50.239 INFO TaskSchedulerImpl - Removed TaskSet 242.0, whose tasks have all completed, from pool
18:46:50.239 INFO DAGScheduler - ResultStage 242 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
18:46:50.239 INFO DAGScheduler - Job 182 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:50.239 INFO TaskSchedulerImpl - Killing all running tasks in stage 242: Stage finished
18:46:50.239 INFO DAGScheduler - Job 182 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029998 s
18:46:50.249 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:50.249 INFO DAGScheduler - Got job 183 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:50.249 INFO DAGScheduler - Final stage: ResultStage 243 (count at ReadsSparkSinkUnitTest.java:185)
18:46:50.249 INFO DAGScheduler - Parents of final stage: List()
18:46:50.249 INFO DAGScheduler - Missing parents: List()
18:46:50.249 INFO DAGScheduler - Submitting ResultStage 243 (MapPartitionsRDD[1148] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:50.266 INFO MemoryStore - Block broadcast_489 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
18:46:50.267 INFO MemoryStore - Block broadcast_489_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
18:46:50.267 INFO BlockManagerInfo - Added broadcast_489_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.5 MiB)
18:46:50.267 INFO SparkContext - Created broadcast 489 from broadcast at DAGScheduler.scala:1580
18:46:50.267 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 243 (MapPartitionsRDD[1148] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:50.268 INFO TaskSchedulerImpl - Adding task set 243.0 with 1 tasks resource profile 0
18:46:50.268 INFO TaskSetManager - Starting task 0.0 in stage 243.0 (TID 299) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
18:46:50.268 INFO Executor - Running task 0.0 in stage 243.0 (TID 299)
18:46:50.297 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
18:46:50.307 INFO Executor - Finished task 0.0 in stage 243.0 (TID 299). 989 bytes result sent to driver
18:46:50.308 INFO TaskSetManager - Finished task 0.0 in stage 243.0 (TID 299) in 40 ms on localhost (executor driver) (1/1)
18:46:50.308 INFO TaskSchedulerImpl - Removed TaskSet 243.0, whose tasks have all completed, from pool
18:46:50.308 INFO DAGScheduler - ResultStage 243 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.059 s
18:46:50.308 INFO DAGScheduler - Job 183 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:50.308 INFO TaskSchedulerImpl - Killing all running tasks in stage 243: Stage finished
18:46:50.308 INFO DAGScheduler - Job 183 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059377 s
18:46:50.311 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:50.311 INFO DAGScheduler - Got job 184 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:50.311 INFO DAGScheduler - Final stage: ResultStage 244 (count at ReadsSparkSinkUnitTest.java:185)
18:46:50.311 INFO DAGScheduler - Parents of final stage: List()
18:46:50.312 INFO DAGScheduler - Missing parents: List()
18:46:50.312 INFO DAGScheduler - Submitting ResultStage 244 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:50.318 INFO MemoryStore - Block broadcast_490 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
18:46:50.318 INFO MemoryStore - Block broadcast_490_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
18:46:50.318 INFO BlockManagerInfo - Added broadcast_490_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.4 MiB)
18:46:50.318 INFO SparkContext - Created broadcast 490 from broadcast at DAGScheduler.scala:1580
18:46:50.319 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 244 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:50.319 INFO TaskSchedulerImpl - Adding task set 244.0 with 1 tasks resource profile 0
18:46:50.319 INFO TaskSetManager - Starting task 0.0 in stage 244.0 (TID 300) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:50.319 INFO Executor - Running task 0.0 in stage 244.0 (TID 300)
18:46:50.330 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam:0+235514
18:46:50.330 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.331 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam dst=null perm=null proto=rpc
18:46:50.331 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.332 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.332 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_c38bfae6-d612-472f-8724-4410229f0bd8.bam.bai dst=null perm=null proto=rpc
18:46:50.333 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
18:46:50.335 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:50.335 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
18:46:50.336 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
18:46:50.336 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:50.338 INFO Executor - Finished task 0.0 in stage 244.0 (TID 300). 989 bytes result sent to driver
18:46:50.338 INFO TaskSetManager - Finished task 0.0 in stage 244.0 (TID 300) in 19 ms on localhost (executor driver) (1/1)
18:46:50.338 INFO TaskSchedulerImpl - Removed TaskSet 244.0, whose tasks have all completed, from pool
18:46:50.338 INFO DAGScheduler - ResultStage 244 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
18:46:50.338 INFO DAGScheduler - Job 184 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:50.338 INFO TaskSchedulerImpl - Killing all running tasks in stage 244: Stage finished
18:46:50.338 INFO DAGScheduler - Job 184 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026853 s
18:46:50.346 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:50.347 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:50.348 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:50.348 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:50.351 INFO MemoryStore - Block broadcast_491 stored as values in memory (estimated size 298.0 KiB, free 1916.8 MiB)
18:46:50.361 INFO MemoryStore - Block broadcast_491_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
18:46:50.361 INFO BlockManagerInfo - Added broadcast_491_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:50.361 INFO SparkContext - Created broadcast 491 from newAPIHadoopFile at PathSplitSource.java:96
18:46:50.395 INFO MemoryStore - Block broadcast_492 stored as values in memory (estimated size 298.0 KiB, free 1916.4 MiB)
18:46:50.401 INFO MemoryStore - Block broadcast_492_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
18:46:50.401 INFO BlockManagerInfo - Added broadcast_492_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:50.401 INFO SparkContext - Created broadcast 492 from newAPIHadoopFile at PathSplitSource.java:96
18:46:50.420 INFO FileInputFormat - Total input files to process : 1
18:46:50.421 INFO MemoryStore - Block broadcast_493 stored as values in memory (estimated size 19.6 KiB, free 1916.4 MiB)
18:46:50.422 INFO MemoryStore - Block broadcast_493_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
18:46:50.422 INFO BlockManagerInfo - Added broadcast_493_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.3 MiB)
18:46:50.422 INFO SparkContext - Created broadcast 493 from broadcast at ReadsSparkSink.java:133
18:46:50.423 INFO MemoryStore - Block broadcast_494 stored as values in memory (estimated size 20.0 KiB, free 1916.3 MiB)
18:46:50.423 INFO MemoryStore - Block broadcast_494_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
18:46:50.423 INFO BlockManagerInfo - Added broadcast_494_piece0 in memory on localhost:45727 (size: 1890.0 B, free: 1919.3 MiB)
18:46:50.423 INFO SparkContext - Created broadcast 494 from broadcast at BamSink.java:76
18:46:50.425 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts dst=null perm=null proto=rpc
18:46:50.425 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:50.425 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:50.425 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:50.426 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:50.432 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:50.432 INFO DAGScheduler - Registering RDD 1180 (mapToPair at SparkUtils.java:161) as input to shuffle 49
18:46:50.432 INFO DAGScheduler - Got job 185 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:50.432 INFO DAGScheduler - Final stage: ResultStage 246 (runJob at SparkHadoopWriter.scala:83)
18:46:50.432 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 245)
18:46:50.432 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 245)
18:46:50.432 INFO DAGScheduler - Submitting ShuffleMapStage 245 (MapPartitionsRDD[1180] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:50.449 INFO MemoryStore - Block broadcast_495 stored as values in memory (estimated size 434.3 KiB, free 1915.9 MiB)
18:46:50.451 INFO MemoryStore - Block broadcast_495_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1915.8 MiB)
18:46:50.451 INFO BlockManagerInfo - Added broadcast_495_piece0 in memory on localhost:45727 (size: 157.6 KiB, free: 1919.1 MiB)
18:46:50.451 INFO SparkContext - Created broadcast 495 from broadcast at DAGScheduler.scala:1580
18:46:50.451 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 245 (MapPartitionsRDD[1180] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:50.451 INFO TaskSchedulerImpl - Adding task set 245.0 with 1 tasks resource profile 0
18:46:50.452 INFO TaskSetManager - Starting task 0.0 in stage 245.0 (TID 301) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
18:46:50.452 INFO Executor - Running task 0.0 in stage 245.0 (TID 301)
18:46:50.482 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:50.495 INFO Executor - Finished task 0.0 in stage 245.0 (TID 301). 1148 bytes result sent to driver
18:46:50.495 INFO TaskSetManager - Finished task 0.0 in stage 245.0 (TID 301) in 43 ms on localhost (executor driver) (1/1)
18:46:50.495 INFO TaskSchedulerImpl - Removed TaskSet 245.0, whose tasks have all completed, from pool
18:46:50.495 INFO DAGScheduler - ShuffleMapStage 245 (mapToPair at SparkUtils.java:161) finished in 0.062 s
18:46:50.495 INFO DAGScheduler - looking for newly runnable stages
18:46:50.495 INFO DAGScheduler - running: HashSet()
18:46:50.495 INFO DAGScheduler - waiting: HashSet(ResultStage 246)
18:46:50.495 INFO DAGScheduler - failed: HashSet()
18:46:50.495 INFO DAGScheduler - Submitting ResultStage 246 (MapPartitionsRDD[1185] at mapToPair at BamSink.java:91), which has no missing parents
18:46:50.502 INFO MemoryStore - Block broadcast_496 stored as values in memory (estimated size 155.4 KiB, free 1915.6 MiB)
18:46:50.502 INFO MemoryStore - Block broadcast_496_piece0 stored as bytes in memory (estimated size 58.5 KiB, free 1915.5 MiB)
18:46:50.503 INFO BlockManagerInfo - Added broadcast_496_piece0 in memory on localhost:45727 (size: 58.5 KiB, free: 1919.1 MiB)
18:46:50.503 INFO SparkContext - Created broadcast 496 from broadcast at DAGScheduler.scala:1580
18:46:50.503 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 246 (MapPartitionsRDD[1185] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:50.503 INFO TaskSchedulerImpl - Adding task set 246.0 with 1 tasks resource profile 0
18:46:50.503 INFO TaskSetManager - Starting task 0.0 in stage 246.0 (TID 302) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:50.504 INFO Executor - Running task 0.0 in stage 246.0 (TID 302)
18:46:50.507 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:50.507 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:50.518 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:50.518 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:50.518 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:50.518 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:50.518 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:50.518 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:50.519 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:50.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:50.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:50.522 INFO StateChange - BLOCK* allocate blk_1073741905_1081, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/part-r-00000
18:46:50.523 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741905_1081 src: /127.0.0.1:58734 dest: /127.0.0.1:38019
18:46:50.524 INFO clienttrace - src: /127.0.0.1:58734, dest: /127.0.0.1:38019, bytes: 235299, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741905_1081, duration(ns): 1018321
18:46:50.524 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741905_1081, type=LAST_IN_PIPELINE terminating
18:46:50.525 INFO FSNamesystem - BLOCK* blk_1073741905_1081 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/part-r-00000
18:46:50.926 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:50.926 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
18:46:50.927 INFO StateChange - BLOCK* allocate blk_1073741906_1082, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/.part-r-00000.sbi
18:46:50.928 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741906_1082 src: /127.0.0.1:58748 dest: /127.0.0.1:38019
18:46:50.929 INFO clienttrace - src: /127.0.0.1:58748, dest: /127.0.0.1:38019, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741906_1082, duration(ns): 398985
18:46:50.929 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741906_1082, type=LAST_IN_PIPELINE terminating
18:46:50.930 INFO FSNamesystem - BLOCK* blk_1073741906_1082 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/.part-r-00000.sbi
18:46:51.047 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741899_1075 replica FinalizedReplica, blk_1073741899_1075, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741899 for deletion
18:46:51.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741900_1076 replica FinalizedReplica, blk_1073741900_1076, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage13238592372457082651/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741900 for deletion
18:46:51.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741899_1075 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741899
18:46:51.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741900_1076 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741900
18:46:51.331 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:51.332 INFO StateChange - BLOCK* allocate blk_1073741907_1083, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/.part-r-00000.bai
18:46:51.332 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741907_1083 src: /127.0.0.1:58750 dest: /127.0.0.1:38019
18:46:51.334 INFO clienttrace - src: /127.0.0.1:58750, dest: /127.0.0.1:38019, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741907_1083, duration(ns): 376775
18:46:51.334 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741907_1083, type=LAST_IN_PIPELINE terminating
18:46:51.334 INFO FSNamesystem - BLOCK* blk_1073741907_1083 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/.part-r-00000.bai
18:46:51.735 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:51.736 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0 dst=null perm=null proto=rpc
18:46:51.736 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0 dst=null perm=null proto=rpc
18:46:51.736 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/task_202505191846503689443677806133434_1185_r_000000 dst=null perm=null proto=rpc
18:46:51.737 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/_temporary/attempt_202505191846503689443677806133434_1185_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/task_202505191846503689443677806133434_1185_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:51.737 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846503689443677806133434_1185_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/task_202505191846503689443677806133434_1185_r_000000
18:46:51.737 INFO SparkHadoopMapRedUtil - attempt_202505191846503689443677806133434_1185_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:51.738 INFO Executor - Finished task 0.0 in stage 246.0 (TID 302). 1858 bytes result sent to driver
18:46:51.738 INFO TaskSetManager - Finished task 0.0 in stage 246.0 (TID 302) in 1235 ms on localhost (executor driver) (1/1)
18:46:51.738 INFO TaskSchedulerImpl - Removed TaskSet 246.0, whose tasks have all completed, from pool
18:46:51.738 INFO DAGScheduler - ResultStage 246 (runJob at SparkHadoopWriter.scala:83) finished in 1.242 s
18:46:51.738 INFO DAGScheduler - Job 185 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:51.738 INFO TaskSchedulerImpl - Killing all running tasks in stage 246: Stage finished
18:46:51.738 INFO DAGScheduler - Job 185 finished: runJob at SparkHadoopWriter.scala:83, took 1.306698 s
18:46:51.739 INFO SparkHadoopWriter - Start to commit write Job job_202505191846503689443677806133434_1185.
18:46:51.739 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:51.740 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts dst=null perm=null proto=rpc
18:46:51.740 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/task_202505191846503689443677806133434_1185_r_000000 dst=null perm=null proto=rpc
18:46:51.740 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:51.741 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/task_202505191846503689443677806133434_1185_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.741 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:51.741 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/task_202505191846503689443677806133434_1185_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.742 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:51.742 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary/0/task_202505191846503689443677806133434_1185_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.743 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_temporary dst=null perm=null proto=rpc
18:46:51.743 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.744 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:51.744 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.spark-staging-1185 dst=null perm=null proto=rpc
18:46:51.744 INFO SparkHadoopWriter - Write Job job_202505191846503689443677806133434_1185 committed. Elapsed time: 5 ms.
18:46:51.745 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.746 INFO StateChange - BLOCK* allocate blk_1073741908_1084, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/header
18:46:51.747 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741908_1084 src: /127.0.0.1:58762 dest: /127.0.0.1:38019
18:46:51.748 INFO clienttrace - src: /127.0.0.1:58762, dest: /127.0.0.1:38019, bytes: 1190, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741908_1084, duration(ns): 383839
18:46:51.748 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741908_1084, type=LAST_IN_PIPELINE terminating
18:46:51.749 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:51.749 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.750 INFO StateChange - BLOCK* allocate blk_1073741909_1085, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/terminator
18:46:51.750 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741909_1085 src: /127.0.0.1:58764 dest: /127.0.0.1:38019
18:46:51.751 INFO clienttrace - src: /127.0.0.1:58764, dest: /127.0.0.1:38019, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741909_1085, duration(ns): 322682
18:46:51.751 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741909_1085, type=LAST_IN_PIPELINE terminating
18:46:51.752 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:51.752 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts dst=null perm=null proto=rpc
18:46:51.753 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.753 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:51.754 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam
18:46:51.754 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.755 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:51.755 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:51.755 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.756 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam done
18:46:51.756 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:51.756 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi
18:46:51.756 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts dst=null perm=null proto=rpc
18:46:51.757 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:51.758 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:51.758 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:51.759 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
18:46:51.760 INFO StateChange - BLOCK* allocate blk_1073741910_1086, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi
18:46:51.761 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741910_1086 src: /127.0.0.1:58770 dest: /127.0.0.1:38019
18:46:51.762 INFO clienttrace - src: /127.0.0.1:58770, dest: /127.0.0.1:38019, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741910_1086, duration(ns): 330808
18:46:51.762 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741910_1086, type=LAST_IN_PIPELINE terminating
18:46:51.762 INFO FSNamesystem - BLOCK* blk_1073741910_1086 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi
18:46:52.163 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:52.164 INFO IndexFileMerger - Done merging .sbi files
18:46:52.164 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/ to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai
18:46:52.164 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts dst=null perm=null proto=rpc
18:46:52.165 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:52.165 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:52.166 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:52.167 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:52.167 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
18:46:52.168 INFO StateChange - BLOCK* allocate blk_1073741911_1087, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai
18:46:52.168 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741911_1087 src: /127.0.0.1:58774 dest: /127.0.0.1:38019
18:46:52.169 INFO clienttrace - src: /127.0.0.1:58774, dest: /127.0.0.1:38019, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741911_1087, duration(ns): 408970
18:46:52.170 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741911_1087, type=LAST_IN_PIPELINE terminating
18:46:52.170 INFO FSNamesystem - BLOCK* blk_1073741911_1087 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai
18:46:52.571 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:52.571 INFO IndexFileMerger - Done merging .bai files
18:46:52.571 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.parts dst=null perm=null proto=rpc
18:46:52.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.587 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi dst=null perm=null proto=rpc
18:46:52.588 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi dst=null perm=null proto=rpc
18:46:52.588 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi dst=null perm=null proto=rpc
18:46:52.589 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
18:46:52.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.591 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.591 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.592 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
18:46:52.593 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:52.593 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
18:46:52.594 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi dst=null perm=null proto=rpc
18:46:52.594 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi dst=null perm=null proto=rpc
18:46:52.594 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.sbi dst=null perm=null proto=rpc
18:46:52.595 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
18:46:52.595 INFO MemoryStore - Block broadcast_497 stored as values in memory (estimated size 312.0 B, free 1915.5 MiB)
18:46:52.599 INFO MemoryStore - Block broadcast_497_piece0 stored as bytes in memory (estimated size 231.0 B, free 1915.6 MiB)
18:46:52.599 INFO BlockManagerInfo - Removed broadcast_490_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.1 MiB)
18:46:52.600 INFO BlockManagerInfo - Added broadcast_497_piece0 in memory on localhost:45727 (size: 231.0 B, free: 1919.1 MiB)
18:46:52.600 INFO SparkContext - Created broadcast 497 from broadcast at BamSource.java:104
18:46:52.600 INFO BlockManagerInfo - Removed broadcast_495_piece0 on localhost:45727 in memory (size: 157.6 KiB, free: 1919.3 MiB)
18:46:52.600 INFO BlockManagerInfo - Removed broadcast_482_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:52.600 INFO BlockManagerInfo - Removed broadcast_488_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.4 MiB)
18:46:52.601 INFO BlockManagerInfo - Removed broadcast_494_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.4 MiB)
18:46:52.601 INFO MemoryStore - Block broadcast_498 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
18:46:52.601 INFO BlockManagerInfo - Removed broadcast_493_piece0 on localhost:45727 in memory (size: 1890.0 B, free: 1919.4 MiB)
18:46:52.602 INFO BlockManagerInfo - Removed broadcast_480_piece0 on localhost:45727 in memory (size: 50.3 KiB, free: 1919.4 MiB)
18:46:52.602 INFO BlockManagerInfo - Removed broadcast_492_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:52.603 INFO BlockManagerInfo - Removed broadcast_486_piece0 on localhost:45727 in memory (size: 233.0 B, free: 1919.5 MiB)
18:46:52.603 INFO BlockManagerInfo - Removed broadcast_487_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.5 MiB)
18:46:52.603 INFO BlockManagerInfo - Removed broadcast_484_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.7 MiB)
18:46:52.604 INFO BlockManagerInfo - Removed broadcast_483_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.7 MiB)
18:46:52.604 INFO BlockManagerInfo - Removed broadcast_489_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.8 MiB)
18:46:52.604 INFO BlockManagerInfo - Removed broadcast_496_piece0 on localhost:45727 in memory (size: 58.5 KiB, free: 1919.9 MiB)
18:46:52.605 INFO BlockManagerInfo - Removed broadcast_485_piece0 on localhost:45727 in memory (size: 67.1 KiB, free: 1920.0 MiB)
18:46:52.609 INFO MemoryStore - Block broadcast_498_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
18:46:52.609 INFO BlockManagerInfo - Added broadcast_498_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.9 MiB)
18:46:52.609 INFO SparkContext - Created broadcast 498 from newAPIHadoopFile at PathSplitSource.java:96
18:46:52.617 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.617 INFO FileInputFormat - Total input files to process : 1
18:46:52.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.632 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:52.632 INFO DAGScheduler - Got job 186 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:52.632 INFO DAGScheduler - Final stage: ResultStage 247 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:52.632 INFO DAGScheduler - Parents of final stage: List()
18:46:52.632 INFO DAGScheduler - Missing parents: List()
18:46:52.633 INFO DAGScheduler - Submitting ResultStage 247 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:52.638 INFO MemoryStore - Block broadcast_499 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
18:46:52.639 INFO MemoryStore - Block broadcast_499_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
18:46:52.639 INFO BlockManagerInfo - Added broadcast_499_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.8 MiB)
18:46:52.639 INFO SparkContext - Created broadcast 499 from broadcast at DAGScheduler.scala:1580
18:46:52.639 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 247 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:52.639 INFO TaskSchedulerImpl - Adding task set 247.0 with 1 tasks resource profile 0
18:46:52.640 INFO TaskSetManager - Starting task 0.0 in stage 247.0 (TID 303) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:52.640 INFO Executor - Running task 0.0 in stage 247.0 (TID 303)
18:46:52.652 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam:0+236517
18:46:52.652 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.653 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.653 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.653 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.654 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.655 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
18:46:52.657 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:52.658 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
18:46:52.658 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:52.660 INFO Executor - Finished task 0.0 in stage 247.0 (TID 303). 749470 bytes result sent to driver
18:46:52.662 INFO TaskSetManager - Finished task 0.0 in stage 247.0 (TID 303) in 22 ms on localhost (executor driver) (1/1)
18:46:52.662 INFO TaskSchedulerImpl - Removed TaskSet 247.0, whose tasks have all completed, from pool
18:46:52.662 INFO DAGScheduler - ResultStage 247 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
18:46:52.662 INFO DAGScheduler - Job 186 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:52.662 INFO TaskSchedulerImpl - Killing all running tasks in stage 247: Stage finished
18:46:52.662 INFO DAGScheduler - Job 186 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029810 s
18:46:52.674 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:52.674 INFO DAGScheduler - Got job 187 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:52.674 INFO DAGScheduler - Final stage: ResultStage 248 (count at ReadsSparkSinkUnitTest.java:185)
18:46:52.674 INFO DAGScheduler - Parents of final stage: List()
18:46:52.674 INFO DAGScheduler - Missing parents: List()
18:46:52.674 INFO DAGScheduler - Submitting ResultStage 248 (MapPartitionsRDD[1173] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:52.691 INFO MemoryStore - Block broadcast_500 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
18:46:52.692 INFO MemoryStore - Block broadcast_500_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
18:46:52.692 INFO BlockManagerInfo - Added broadcast_500_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.7 MiB)
18:46:52.693 INFO SparkContext - Created broadcast 500 from broadcast at DAGScheduler.scala:1580
18:46:52.693 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 248 (MapPartitionsRDD[1173] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:52.693 INFO TaskSchedulerImpl - Adding task set 248.0 with 1 tasks resource profile 0
18:46:52.693 INFO TaskSetManager - Starting task 0.0 in stage 248.0 (TID 304) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
18:46:52.693 INFO Executor - Running task 0.0 in stage 248.0 (TID 304)
18:46:52.722 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
18:46:52.729 INFO Executor - Finished task 0.0 in stage 248.0 (TID 304). 989 bytes result sent to driver
18:46:52.729 INFO TaskSetManager - Finished task 0.0 in stage 248.0 (TID 304) in 36 ms on localhost (executor driver) (1/1)
18:46:52.729 INFO TaskSchedulerImpl - Removed TaskSet 248.0, whose tasks have all completed, from pool
18:46:52.729 INFO DAGScheduler - ResultStage 248 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.055 s
18:46:52.729 INFO DAGScheduler - Job 187 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:52.729 INFO TaskSchedulerImpl - Killing all running tasks in stage 248: Stage finished
18:46:52.729 INFO DAGScheduler - Job 187 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.055381 s
18:46:52.733 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:52.733 INFO DAGScheduler - Got job 188 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:52.733 INFO DAGScheduler - Final stage: ResultStage 249 (count at ReadsSparkSinkUnitTest.java:185)
18:46:52.733 INFO DAGScheduler - Parents of final stage: List()
18:46:52.733 INFO DAGScheduler - Missing parents: List()
18:46:52.733 INFO DAGScheduler - Submitting ResultStage 249 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:52.739 INFO MemoryStore - Block broadcast_501 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
18:46:52.740 INFO MemoryStore - Block broadcast_501_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
18:46:52.740 INFO BlockManagerInfo - Added broadcast_501_piece0 in memory on localhost:45727 (size: 54.6 KiB, free: 1919.6 MiB)
18:46:52.740 INFO SparkContext - Created broadcast 501 from broadcast at DAGScheduler.scala:1580
18:46:52.740 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 249 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:52.740 INFO TaskSchedulerImpl - Adding task set 249.0 with 1 tasks resource profile 0
18:46:52.741 INFO TaskSetManager - Starting task 0.0 in stage 249.0 (TID 305) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:52.741 INFO Executor - Running task 0.0 in stage 249.0 (TID 305)
18:46:52.751 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam:0+236517
18:46:52.752 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.752 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam dst=null perm=null proto=rpc
18:46:52.753 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.753 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.753 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_eefaf62e-8242-42d4-b9bb-a51e66981ce6.bam.bai dst=null perm=null proto=rpc
18:46:52.754 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
18:46:52.755 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:52.756 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
18:46:52.757 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
18:46:52.758 INFO Executor - Finished task 0.0 in stage 249.0 (TID 305). 989 bytes result sent to driver
18:46:52.759 INFO TaskSetManager - Finished task 0.0 in stage 249.0 (TID 305) in 19 ms on localhost (executor driver) (1/1)
18:46:52.759 INFO TaskSchedulerImpl - Removed TaskSet 249.0, whose tasks have all completed, from pool
18:46:52.759 INFO DAGScheduler - ResultStage 249 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
18:46:52.759 INFO DAGScheduler - Job 188 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:52.759 INFO TaskSchedulerImpl - Killing all running tasks in stage 249: Stage finished
18:46:52.759 INFO DAGScheduler - Job 188 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026290 s
18:46:52.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:52.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:52.768 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:52.768 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:52.770 INFO MemoryStore - Block broadcast_502 stored as values in memory (estimated size 576.0 B, free 1918.4 MiB)
18:46:52.771 INFO MemoryStore - Block broadcast_502_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.4 MiB)
18:46:52.771 INFO BlockManagerInfo - Added broadcast_502_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.6 MiB)
18:46:52.771 INFO SparkContext - Created broadcast 502 from broadcast at CramSource.java:114
18:46:52.772 INFO MemoryStore - Block broadcast_503 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
18:46:52.778 INFO MemoryStore - Block broadcast_503_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
18:46:52.778 INFO BlockManagerInfo - Added broadcast_503_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:52.778 INFO SparkContext - Created broadcast 503 from newAPIHadoopFile at PathSplitSource.java:96
18:46:52.793 INFO MemoryStore - Block broadcast_504 stored as values in memory (estimated size 576.0 B, free 1918.0 MiB)
18:46:52.793 INFO MemoryStore - Block broadcast_504_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.0 MiB)
18:46:52.793 INFO BlockManagerInfo - Added broadcast_504_piece0 in memory on localhost:45727 (size: 228.0 B, free: 1919.6 MiB)
18:46:52.793 INFO SparkContext - Created broadcast 504 from broadcast at CramSource.java:114
18:46:52.794 INFO MemoryStore - Block broadcast_505 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
18:46:52.800 INFO MemoryStore - Block broadcast_505_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
18:46:52.800 INFO BlockManagerInfo - Added broadcast_505_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:52.800 INFO SparkContext - Created broadcast 505 from newAPIHadoopFile at PathSplitSource.java:96
18:46:52.813 INFO FileInputFormat - Total input files to process : 1
18:46:52.815 INFO MemoryStore - Block broadcast_506 stored as values in memory (estimated size 6.0 KiB, free 1917.7 MiB)
18:46:52.815 INFO MemoryStore - Block broadcast_506_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
18:46:52.815 INFO BlockManagerInfo - Added broadcast_506_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.5 MiB)
18:46:52.815 INFO SparkContext - Created broadcast 506 from broadcast at ReadsSparkSink.java:133
18:46:52.816 INFO MemoryStore - Block broadcast_507 stored as values in memory (estimated size 6.2 KiB, free 1917.7 MiB)
18:46:52.816 INFO MemoryStore - Block broadcast_507_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
18:46:52.816 INFO BlockManagerInfo - Added broadcast_507_piece0 in memory on localhost:45727 (size: 1473.0 B, free: 1919.5 MiB)
18:46:52.816 INFO SparkContext - Created broadcast 507 from broadcast at CramSink.java:76
18:46:52.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts dst=null perm=null proto=rpc
18:46:52.819 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:52.819 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:52.819 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:52.820 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:52.826 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:52.826 INFO DAGScheduler - Registering RDD 1203 (mapToPair at SparkUtils.java:161) as input to shuffle 50
18:46:52.826 INFO DAGScheduler - Got job 189 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:52.826 INFO DAGScheduler - Final stage: ResultStage 251 (runJob at SparkHadoopWriter.scala:83)
18:46:52.826 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 250)
18:46:52.826 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 250)
18:46:52.827 INFO DAGScheduler - Submitting ShuffleMapStage 250 (MapPartitionsRDD[1203] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:52.847 INFO MemoryStore - Block broadcast_508 stored as values in memory (estimated size 292.8 KiB, free 1917.4 MiB)
18:46:52.848 INFO MemoryStore - Block broadcast_508_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.3 MiB)
18:46:52.848 INFO BlockManagerInfo - Added broadcast_508_piece0 in memory on localhost:45727 (size: 107.3 KiB, free: 1919.4 MiB)
18:46:52.848 INFO SparkContext - Created broadcast 508 from broadcast at DAGScheduler.scala:1580
18:46:52.849 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 250 (MapPartitionsRDD[1203] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:52.849 INFO TaskSchedulerImpl - Adding task set 250.0 with 1 tasks resource profile 0
18:46:52.849 INFO TaskSetManager - Starting task 0.0 in stage 250.0 (TID 306) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
18:46:52.849 INFO Executor - Running task 0.0 in stage 250.0 (TID 306)
18:46:52.870 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:52.879 INFO Executor - Finished task 0.0 in stage 250.0 (TID 306). 1148 bytes result sent to driver
18:46:52.880 INFO TaskSetManager - Finished task 0.0 in stage 250.0 (TID 306) in 31 ms on localhost (executor driver) (1/1)
18:46:52.880 INFO TaskSchedulerImpl - Removed TaskSet 250.0, whose tasks have all completed, from pool
18:46:52.880 INFO DAGScheduler - ShuffleMapStage 250 (mapToPair at SparkUtils.java:161) finished in 0.053 s
18:46:52.880 INFO DAGScheduler - looking for newly runnable stages
18:46:52.880 INFO DAGScheduler - running: HashSet()
18:46:52.880 INFO DAGScheduler - waiting: HashSet(ResultStage 251)
18:46:52.880 INFO DAGScheduler - failed: HashSet()
18:46:52.880 INFO DAGScheduler - Submitting ResultStage 251 (MapPartitionsRDD[1208] at mapToPair at CramSink.java:89), which has no missing parents
18:46:52.886 INFO MemoryStore - Block broadcast_509 stored as values in memory (estimated size 153.3 KiB, free 1917.1 MiB)
18:46:52.887 INFO MemoryStore - Block broadcast_509_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1917.1 MiB)
18:46:52.887 INFO BlockManagerInfo - Added broadcast_509_piece0 in memory on localhost:45727 (size: 58.1 KiB, free: 1919.4 MiB)
18:46:52.887 INFO SparkContext - Created broadcast 509 from broadcast at DAGScheduler.scala:1580
18:46:52.888 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 251 (MapPartitionsRDD[1208] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
18:46:52.888 INFO TaskSchedulerImpl - Adding task set 251.0 with 1 tasks resource profile 0
18:46:52.888 INFO TaskSetManager - Starting task 0.0 in stage 251.0 (TID 307) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:52.888 INFO Executor - Running task 0.0 in stage 251.0 (TID 307)
18:46:52.892 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:52.892 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:52.898 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:52.898 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:52.898 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:52.898 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:52.898 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:52.898 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:52.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/_temporary/attempt_202505191846521733162188069757979_1208_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:52.921 INFO StateChange - BLOCK* allocate blk_1073741912_1088, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/_temporary/attempt_202505191846521733162188069757979_1208_r_000000_0/part-r-00000
18:46:52.922 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741912_1088 src: /127.0.0.1:58790 dest: /127.0.0.1:38019
18:46:52.923 INFO clienttrace - src: /127.0.0.1:58790, dest: /127.0.0.1:38019, bytes: 42659, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741912_1088, duration(ns): 483835
18:46:52.923 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741912_1088, type=LAST_IN_PIPELINE terminating
18:46:52.924 INFO FSNamesystem - BLOCK* blk_1073741912_1088 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/_temporary/attempt_202505191846521733162188069757979_1208_r_000000_0/part-r-00000
18:46:53.324 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/_temporary/attempt_202505191846521733162188069757979_1208_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:53.325 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/_temporary/attempt_202505191846521733162188069757979_1208_r_000000_0 dst=null perm=null proto=rpc
18:46:53.326 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/_temporary/attempt_202505191846521733162188069757979_1208_r_000000_0 dst=null perm=null proto=rpc
18:46:53.326 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/task_202505191846521733162188069757979_1208_r_000000 dst=null perm=null proto=rpc
18:46:53.327 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/_temporary/attempt_202505191846521733162188069757979_1208_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/task_202505191846521733162188069757979_1208_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:53.327 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846521733162188069757979_1208_r_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/task_202505191846521733162188069757979_1208_r_000000
18:46:53.327 INFO SparkHadoopMapRedUtil - attempt_202505191846521733162188069757979_1208_r_000000_0: Committed. Elapsed time: 1 ms.
18:46:53.327 INFO Executor - Finished task 0.0 in stage 251.0 (TID 307). 1858 bytes result sent to driver
18:46:53.328 INFO TaskSetManager - Finished task 0.0 in stage 251.0 (TID 307) in 440 ms on localhost (executor driver) (1/1)
18:46:53.328 INFO TaskSchedulerImpl - Removed TaskSet 251.0, whose tasks have all completed, from pool
18:46:53.328 INFO DAGScheduler - ResultStage 251 (runJob at SparkHadoopWriter.scala:83) finished in 0.448 s
18:46:53.328 INFO DAGScheduler - Job 189 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:53.328 INFO TaskSchedulerImpl - Killing all running tasks in stage 251: Stage finished
18:46:53.328 INFO DAGScheduler - Job 189 finished: runJob at SparkHadoopWriter.scala:83, took 0.502499 s
18:46:53.329 INFO SparkHadoopWriter - Start to commit write Job job_202505191846521733162188069757979_1208.
18:46:53.329 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:53.329 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts dst=null perm=null proto=rpc
18:46:53.330 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/task_202505191846521733162188069757979_1208_r_000000 dst=null perm=null proto=rpc
18:46:53.330 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/part-r-00000 dst=null perm=null proto=rpc
18:46:53.331 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary/0/task_202505191846521733162188069757979_1208_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:53.331 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_temporary dst=null perm=null proto=rpc
18:46:53.332 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:53.333 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:53.333 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/.spark-staging-1208 dst=null perm=null proto=rpc
18:46:53.333 INFO SparkHadoopWriter - Write Job job_202505191846521733162188069757979_1208 committed. Elapsed time: 4 ms.
18:46:53.334 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:53.336 INFO StateChange - BLOCK* allocate blk_1073741913_1089, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/header
18:46:53.336 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741913_1089 src: /127.0.0.1:58806 dest: /127.0.0.1:38019
18:46:53.338 INFO clienttrace - src: /127.0.0.1:58806, dest: /127.0.0.1:38019, bytes: 1016, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741913_1089, duration(ns): 404194
18:46:53.338 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741913_1089, type=LAST_IN_PIPELINE terminating
18:46:53.338 INFO FSNamesystem - BLOCK* blk_1073741913_1089 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/header
18:46:53.739 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:53.740 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:53.741 INFO StateChange - BLOCK* allocate blk_1073741914_1090, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/terminator
18:46:53.742 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741914_1090 src: /127.0.0.1:58816 dest: /127.0.0.1:38019
18:46:53.743 INFO clienttrace - src: /127.0.0.1:58816, dest: /127.0.0.1:38019, bytes: 38, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741914_1090, duration(ns): 356703
18:46:53.743 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741914_1090, type=LAST_IN_PIPELINE terminating
18:46:53.743 INFO FSNamesystem - BLOCK* blk_1073741914_1090 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/terminator
18:46:54.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741906_1082 replica FinalizedReplica, blk_1073741906_1082, FINALIZED, getNumBytes()=204, getBytesOnDisk()=204, getVisibleLength()=204, getVolume()=/tmp/minicluster_storage13238592372457082651/data/data2, getBlockURI()=file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741906 for deletion
18:46:54.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741907_1083 replica FinalizedReplica, blk_1073741907_1083, FINALIZED, getNumBytes()=592, getBytesOnDisk()=592, getVisibleLength()=592, getVolume()=/tmp/minicluster_storage13238592372457082651/data/data1, getBlockURI()=file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741907 for deletion
18:46:54.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741906_1082 URI file:/tmp/minicluster_storage13238592372457082651/data/data2/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741906
18:46:54.048 INFO FsDatasetAsyncDiskService - Deleted BP-1968466779-10.1.0.176-1747680367669 blk_1073741907_1083 URI file:/tmp/minicluster_storage13238592372457082651/data/data1/current/BP-1968466779-10.1.0.176-1747680367669/current/finalized/subdir0/subdir0/blk_1073741907
18:46:54.144 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/terminator is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:54.145 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts dst=null perm=null proto=rpc
18:46:54.146 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.146 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:54.146 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram
18:46:54.147 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/header, /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.148 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.148 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.148 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts/output dst=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.149 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram done
18:46:54.149 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.parts dst=null perm=null proto=rpc
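The sequence above — write header, part-r-00000, and terminator under the .cram.parts directory, create an empty output file, concat the parts into it, rename it over the final .cram, and delete the parts directory — is the part-merge pattern recorded in the audit log. A minimal sketch of the same idea against the public Hadoop API follows; the paths are hypothetical, this is not the GATK/disq implementation, and HDFS places restrictions on concat sources (e.g. matching block sizes, non-empty sources).

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConcatPartsSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Hypothetical paths mirroring the header/part/terminator layout in the log.
            Path partsDir = new Path("hdfs://localhost:36797/user/runner/example.cram.parts");
            Path merged = new Path(partsDir, "output");
            FileSystem fs = merged.getFileSystem(conf);

            fs.createNewFile(merged); // concat needs an existing target file
            fs.concat(merged, new Path[] {
                    new Path(partsDir, "header"),
                    new Path(partsDir, "part-r-00000"),
                    new Path(partsDir, "terminator")
            });
            // Move the merged file into place and drop the temporary parts directory.
            fs.rename(merged, new Path("hdfs://localhost:36797/user/runner/example.cram"));
            fs.delete(partsDir, true);
        }
    }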
18:46:54.149 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.150 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.150 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.150 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.151 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.crai dst=null perm=null proto=rpc
18:46:54.151 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.crai dst=null perm=null proto=rpc
18:46:54.153 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
18:46:54.154 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
18:46:54.154 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.154 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.155 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.crai dst=null perm=null proto=rpc
18:46:54.155 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.crai dst=null perm=null proto=rpc
18:46:54.155 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.156 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.156 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
18:46:54.157 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
18:46:54.157 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
18:46:54.158 INFO MemoryStore - Block broadcast_510 stored as values in memory (estimated size 528.0 B, free 1917.1 MiB)
18:46:54.162 INFO MemoryStore - Block broadcast_510_piece0 stored as bytes in memory (estimated size 187.0 B, free 1917.1 MiB)
18:46:54.162 INFO BlockManagerInfo - Added broadcast_510_piece0 in memory on localhost:45727 (size: 187.0 B, free: 1919.4 MiB)
18:46:54.162 INFO BlockManagerInfo - Removed broadcast_498_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:54.162 INFO SparkContext - Created broadcast 510 from broadcast at CramSource.java:114
18:46:54.162 INFO BlockManagerInfo - Removed broadcast_507_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.4 MiB)
18:46:54.163 INFO BlockManagerInfo - Removed broadcast_499_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.5 MiB)
18:46:54.163 INFO BlockManagerInfo - Removed broadcast_508_piece0 on localhost:45727 in memory (size: 107.3 KiB, free: 1919.6 MiB)
18:46:54.163 INFO MemoryStore - Block broadcast_511 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
18:46:54.164 INFO BlockManagerInfo - Removed broadcast_506_piece0 on localhost:45727 in memory (size: 1473.0 B, free: 1919.6 MiB)
18:46:54.165 INFO BlockManagerInfo - Removed broadcast_500_piece0 on localhost:45727 in memory (size: 153.6 KiB, free: 1919.7 MiB)
18:46:54.165 INFO BlockManagerInfo - Removed broadcast_491_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.8 MiB)
18:46:54.166 INFO BlockManagerInfo - Removed broadcast_501_piece0 on localhost:45727 in memory (size: 54.6 KiB, free: 1919.8 MiB)
18:46:54.166 INFO BlockManagerInfo - Removed broadcast_505_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.9 MiB)
18:46:54.167 INFO BlockManagerInfo - Removed broadcast_509_piece0 on localhost:45727 in memory (size: 58.1 KiB, free: 1920.0 MiB)
18:46:54.167 INFO BlockManagerInfo - Removed broadcast_497_piece0 on localhost:45727 in memory (size: 231.0 B, free: 1920.0 MiB)
18:46:54.168 INFO BlockManagerInfo - Removed broadcast_504_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1920.0 MiB)
18:46:54.173 INFO MemoryStore - Block broadcast_511_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
18:46:54.174 INFO BlockManagerInfo - Added broadcast_511_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.9 MiB)
18:46:54.174 INFO SparkContext - Created broadcast 511 from newAPIHadoopFile at PathSplitSource.java:96
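The "Created broadcast ... from newAPIHadoopFile at PathSplitSource.java:96" lines come from the reader side building an RDD over the just-written file. A minimal, self-contained sketch of the underlying Spark call is below; it uses a generic TextInputFormat and a hypothetical path rather than the CRAM-specific input format the test actually uses.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class NewApiHadoopFileSketch {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                    new SparkConf().setMaster("local[1]").setAppName("newAPIHadoopFile-sketch"));
            // Hypothetical input path; the test reads the generated ReadsSparkSinkUnitTest*.cram instead.
            JavaPairRDD<LongWritable, Text> records = sc.newAPIHadoopFile(
                    "hdfs://localhost:36797/user/runner/example.txt",
                    TextInputFormat.class, LongWritable.class, Text.class, new Configuration());
            System.out.println(records.count());
            sc.stop();
        }
    }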
18:46:54.188 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.188 INFO FileInputFormat - Total input files to process : 1
18:46:54.189 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.216 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:54.217 INFO DAGScheduler - Got job 190 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:54.217 INFO DAGScheduler - Final stage: ResultStage 252 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:54.217 INFO DAGScheduler - Parents of final stage: List()
18:46:54.217 INFO DAGScheduler - Missing parents: List()
18:46:54.217 INFO DAGScheduler - Submitting ResultStage 252 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:54.228 INFO MemoryStore - Block broadcast_512 stored as values in memory (estimated size 286.8 KiB, free 1919.0 MiB)
18:46:54.229 INFO MemoryStore - Block broadcast_512_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.9 MiB)
18:46:54.229 INFO BlockManagerInfo - Added broadcast_512_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.8 MiB)
18:46:54.229 INFO SparkContext - Created broadcast 512 from broadcast at DAGScheduler.scala:1580
18:46:54.230 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 252 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:54.230 INFO TaskSchedulerImpl - Adding task set 252.0 with 1 tasks resource profile 0
18:46:54.230 INFO TaskSetManager - Starting task 0.0 in stage 252.0 (TID 308) (localhost, executor driver, partition 0, ANY, 7853 bytes)
18:46:54.230 INFO Executor - Running task 0.0 in stage 252.0 (TID 308)
18:46:54.251 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram:0+43713
18:46:54.252 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.252 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.crai dst=null perm=null proto=rpc
18:46:54.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.crai dst=null perm=null proto=rpc
18:46:54.255 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
18:46:54.255 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
18:46:54.255 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
18:46:54.266 INFO Executor - Finished task 0.0 in stage 252.0 (TID 308). 154058 bytes result sent to driver
18:46:54.266 INFO TaskSetManager - Finished task 0.0 in stage 252.0 (TID 308) in 36 ms on localhost (executor driver) (1/1)
18:46:54.266 INFO TaskSchedulerImpl - Removed TaskSet 252.0, whose tasks have all completed, from pool
18:46:54.266 INFO DAGScheduler - ResultStage 252 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.049 s
18:46:54.266 INFO DAGScheduler - Job 190 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:54.267 INFO TaskSchedulerImpl - Killing all running tasks in stage 252: Stage finished
18:46:54.267 INFO DAGScheduler - Job 190 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.050352 s
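Jobs 190-192 are the verification passes: a collect at ReadsSparkSinkUnitTest.java:182 followed by counts at line 185 over the original and round-tripped inputs. The exact assertions are not in the log; a hypothetical check that mirrors the collect/count pattern might look like this (the class, method, and variable names are invented for illustration).

    import java.util.List;
    import org.apache.spark.api.java.JavaRDD;

    public class RoundTripCheckSketch {
        // Hypothetical helper mirroring the collect-then-count pattern seen in jobs 190-192.
        static void assertSameSize(JavaRDD<String> original, JavaRDD<String> roundTripped) {
            List<String> reRead = roundTripped.collect(); // materialize the re-read records
            long originalCount = original.count();
            long roundTrippedCount = roundTripped.count();
            if (reRead.size() != originalCount || originalCount != roundTrippedCount) {
                throw new AssertionError("round-tripped read count does not match original");
            }
        }
    }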
18:46:54.272 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:54.272 INFO DAGScheduler - Got job 191 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:54.272 INFO DAGScheduler - Final stage: ResultStage 253 (count at ReadsSparkSinkUnitTest.java:185)
18:46:54.272 INFO DAGScheduler - Parents of final stage: List()
18:46:54.272 INFO DAGScheduler - Missing parents: List()
18:46:54.272 INFO DAGScheduler - Submitting ResultStage 253 (MapPartitionsRDD[1197] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:54.284 INFO MemoryStore - Block broadcast_513 stored as values in memory (estimated size 286.8 KiB, free 1918.7 MiB)
18:46:54.285 INFO MemoryStore - Block broadcast_513_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.6 MiB)
18:46:54.285 INFO BlockManagerInfo - Added broadcast_513_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.7 MiB)
18:46:54.285 INFO SparkContext - Created broadcast 513 from broadcast at DAGScheduler.scala:1580
18:46:54.285 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 253 (MapPartitionsRDD[1197] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:54.285 INFO TaskSchedulerImpl - Adding task set 253.0 with 1 tasks resource profile 0
18:46:54.286 INFO TaskSetManager - Starting task 0.0 in stage 253.0 (TID 309) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
18:46:54.286 INFO Executor - Running task 0.0 in stage 253.0 (TID 309)
18:46:54.307 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
18:46:54.312 INFO Executor - Finished task 0.0 in stage 253.0 (TID 309). 989 bytes result sent to driver
18:46:54.313 INFO TaskSetManager - Finished task 0.0 in stage 253.0 (TID 309) in 27 ms on localhost (executor driver) (1/1)
18:46:54.313 INFO TaskSchedulerImpl - Removed TaskSet 253.0, whose tasks have all completed, from pool
18:46:54.313 INFO DAGScheduler - ResultStage 253 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.041 s
18:46:54.313 INFO DAGScheduler - Job 191 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:54.313 INFO TaskSchedulerImpl - Killing all running tasks in stage 253: Stage finished
18:46:54.313 INFO DAGScheduler - Job 191 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.041495 s
18:46:54.316 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:54.317 INFO DAGScheduler - Got job 192 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:54.317 INFO DAGScheduler - Final stage: ResultStage 254 (count at ReadsSparkSinkUnitTest.java:185)
18:46:54.317 INFO DAGScheduler - Parents of final stage: List()
18:46:54.317 INFO DAGScheduler - Missing parents: List()
18:46:54.317 INFO DAGScheduler - Submitting ResultStage 254 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:54.328 INFO MemoryStore - Block broadcast_514 stored as values in memory (estimated size 286.8 KiB, free 1918.3 MiB)
18:46:54.329 INFO MemoryStore - Block broadcast_514_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.2 MiB)
18:46:54.329 INFO BlockManagerInfo - Added broadcast_514_piece0 in memory on localhost:45727 (size: 103.6 KiB, free: 1919.6 MiB)
18:46:54.330 INFO SparkContext - Created broadcast 514 from broadcast at DAGScheduler.scala:1580
18:46:54.330 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 254 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:54.330 INFO TaskSchedulerImpl - Adding task set 254.0 with 1 tasks resource profile 0
18:46:54.330 INFO TaskSetManager - Starting task 0.0 in stage 254.0 (TID 310) (localhost, executor driver, partition 0, ANY, 7853 bytes)
18:46:54.330 INFO Executor - Running task 0.0 in stage 254.0 (TID 310)
18:46:54.350 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram:0+43713
18:46:54.350 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram dst=null perm=null proto=rpc
18:46:54.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.cram.crai dst=null perm=null proto=rpc
18:46:54.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_44a6d399-0aa3-4c18-bf6f-03fb8ad76cad.crai dst=null perm=null proto=rpc
18:46:54.353 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
18:46:54.354 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
18:46:54.354 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
18:46:54.361 INFO Executor - Finished task 0.0 in stage 254.0 (TID 310). 989 bytes result sent to driver
18:46:54.362 INFO TaskSetManager - Finished task 0.0 in stage 254.0 (TID 310) in 32 ms on localhost (executor driver) (1/1)
18:46:54.362 INFO TaskSchedulerImpl - Removed TaskSet 254.0, whose tasks have all completed, from pool
18:46:54.362 INFO DAGScheduler - ResultStage 254 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.045 s
18:46:54.362 INFO DAGScheduler - Job 192 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:54.362 INFO TaskSchedulerImpl - Killing all running tasks in stage 254: Stage finished
18:46:54.362 INFO DAGScheduler - Job 192 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.045623 s
18:46:54.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.376 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.377 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:54.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.380 INFO MemoryStore - Block broadcast_515 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
18:46:54.390 INFO MemoryStore - Block broadcast_515_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.8 MiB)
18:46:54.390 INFO BlockManagerInfo - Added broadcast_515_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:54.391 INFO SparkContext - Created broadcast 515 from newAPIHadoopFile at PathSplitSource.java:96
18:46:54.420 INFO MemoryStore - Block broadcast_516 stored as values in memory (estimated size 297.9 KiB, free 1917.5 MiB)
18:46:54.427 INFO MemoryStore - Block broadcast_516_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
18:46:54.427 INFO BlockManagerInfo - Added broadcast_516_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.5 MiB)
18:46:54.427 INFO SparkContext - Created broadcast 516 from newAPIHadoopFile at PathSplitSource.java:96
18:46:54.447 INFO FileInputFormat - Total input files to process : 1
18:46:54.449 INFO MemoryStore - Block broadcast_517 stored as values in memory (estimated size 160.7 KiB, free 1917.3 MiB)
18:46:54.449 INFO MemoryStore - Block broadcast_517_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
18:46:54.449 INFO BlockManagerInfo - Added broadcast_517_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.5 MiB)
18:46:54.450 INFO SparkContext - Created broadcast 517 from broadcast at ReadsSparkSink.java:133
18:46:54.453 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts dst=null perm=null proto=rpc
18:46:54.454 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
18:46:54.454 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:54.454 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:54.454 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
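The committer lines above (output committer class, algorithm version 1, cleanup flags) are driven by standard Hadoop configuration keys. A small sketch showing the corresponding settings; these are the stock keys read by FileOutputCommitter, not anything test-specific.

    import org.apache.hadoop.conf.Configuration;

    public class CommitterConfigSketch {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // Algorithm version 1 (logged above) commits each task by renaming its attempt
            // directory under _temporary, then the job commit moves task output into the
            // final output directory; version 2 renames task files directly at task commit.
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            conf.setBoolean("mapreduce.fileoutputcommitter.cleanup.skipped", false);
            conf.setBoolean("mapreduce.fileoutputcommitter.cleanup-failures.ignored", false);
            System.out.println(conf.get("mapreduce.fileoutputcommitter.algorithm.version"));
        }
    }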
18:46:54.460 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:54.461 INFO DAGScheduler - Registering RDD 1228 (mapToPair at SparkUtils.java:161) as input to shuffle 51
18:46:54.461 INFO DAGScheduler - Got job 193 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:54.461 INFO DAGScheduler - Final stage: ResultStage 256 (runJob at SparkHadoopWriter.scala:83)
18:46:54.461 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 255)
18:46:54.461 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 255)
18:46:54.461 INFO DAGScheduler - Submitting ShuffleMapStage 255 (MapPartitionsRDD[1228] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:54.480 INFO MemoryStore - Block broadcast_518 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
18:46:54.482 INFO MemoryStore - Block broadcast_518_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
18:46:54.482 INFO BlockManagerInfo - Added broadcast_518_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.3 MiB)
18:46:54.482 INFO SparkContext - Created broadcast 518 from broadcast at DAGScheduler.scala:1580
18:46:54.482 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 255 (MapPartitionsRDD[1228] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:54.482 INFO TaskSchedulerImpl - Adding task set 255.0 with 1 tasks resource profile 0
18:46:54.482 INFO TaskSetManager - Starting task 0.0 in stage 255.0 (TID 311) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:54.483 INFO Executor - Running task 0.0 in stage 255.0 (TID 311)
18:46:54.511 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:54.526 INFO Executor - Finished task 0.0 in stage 255.0 (TID 311). 1148 bytes result sent to driver
18:46:54.526 INFO TaskSetManager - Finished task 0.0 in stage 255.0 (TID 311) in 44 ms on localhost (executor driver) (1/1)
18:46:54.526 INFO TaskSchedulerImpl - Removed TaskSet 255.0, whose tasks have all completed, from pool
18:46:54.526 INFO DAGScheduler - ShuffleMapStage 255 (mapToPair at SparkUtils.java:161) finished in 0.065 s
18:46:54.526 INFO DAGScheduler - looking for newly runnable stages
18:46:54.526 INFO DAGScheduler - running: HashSet()
18:46:54.526 INFO DAGScheduler - waiting: HashSet(ResultStage 256)
18:46:54.526 INFO DAGScheduler - failed: HashSet()
18:46:54.526 INFO DAGScheduler - Submitting ResultStage 256 (MapPartitionsRDD[1234] at saveAsTextFile at SamSink.java:65), which has no missing parents
18:46:54.533 INFO MemoryStore - Block broadcast_519 stored as values in memory (estimated size 241.1 KiB, free 1916.4 MiB)
18:46:54.534 INFO MemoryStore - Block broadcast_519_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.4 MiB)
18:46:54.534 INFO BlockManagerInfo - Added broadcast_519_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.3 MiB)
18:46:54.534 INFO SparkContext - Created broadcast 519 from broadcast at DAGScheduler.scala:1580
18:46:54.534 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 256 (MapPartitionsRDD[1234] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
18:46:54.534 INFO TaskSchedulerImpl - Adding task set 256.0 with 1 tasks resource profile 0
18:46:54.535 INFO TaskSetManager - Starting task 0.0 in stage 256.0 (TID 312) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:54.535 INFO Executor - Running task 0.0 in stage 256.0 (TID 312)
18:46:54.539 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:54.539 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:54.549 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
18:46:54.549 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:54.549 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:54.550 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/_temporary/attempt_202505191846545029178356309085034_1234_m_000000_0/part-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.552 INFO StateChange - BLOCK* allocate blk_1073741915_1091, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/_temporary/attempt_202505191846545029178356309085034_1234_m_000000_0/part-00000
18:46:54.553 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741915_1091 src: /127.0.0.1:58818 dest: /127.0.0.1:38019
18:46:54.558 INFO clienttrace - src: /127.0.0.1:58818, dest: /127.0.0.1:38019, bytes: 761729, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741915_1091, duration(ns): 5097132
18:46:54.559 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741915_1091, type=LAST_IN_PIPELINE terminating
18:46:54.559 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/_temporary/attempt_202505191846545029178356309085034_1234_m_000000_0/part-00000 is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:54.560 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/_temporary/attempt_202505191846545029178356309085034_1234_m_000000_0 dst=null perm=null proto=rpc
18:46:54.560 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/_temporary/attempt_202505191846545029178356309085034_1234_m_000000_0 dst=null perm=null proto=rpc
18:46:54.561 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/task_202505191846545029178356309085034_1234_m_000000 dst=null perm=null proto=rpc
18:46:54.561 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/_temporary/attempt_202505191846545029178356309085034_1234_m_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/task_202505191846545029178356309085034_1234_m_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
18:46:54.561 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846545029178356309085034_1234_m_000000_0' to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/task_202505191846545029178356309085034_1234_m_000000
18:46:54.561 INFO SparkHadoopMapRedUtil - attempt_202505191846545029178356309085034_1234_m_000000_0: Committed. Elapsed time: 1 ms.
18:46:54.562 INFO Executor - Finished task 0.0 in stage 256.0 (TID 312). 1858 bytes result sent to driver
18:46:54.562 INFO TaskSetManager - Finished task 0.0 in stage 256.0 (TID 312) in 28 ms on localhost (executor driver) (1/1)
18:46:54.562 INFO TaskSchedulerImpl - Removed TaskSet 256.0, whose tasks have all completed, from pool
18:46:54.562 INFO DAGScheduler - ResultStage 256 (runJob at SparkHadoopWriter.scala:83) finished in 0.035 s
18:46:54.562 INFO DAGScheduler - Job 193 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:54.562 INFO TaskSchedulerImpl - Killing all running tasks in stage 256: Stage finished
18:46:54.562 INFO DAGScheduler - Job 193 finished: runJob at SparkHadoopWriter.scala:83, took 0.102172 s
18:46:54.563 INFO SparkHadoopWriter - Start to commit write Job job_202505191846545029178356309085034_1234.
18:46:54.563 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0 dst=null perm=null proto=rpc
18:46:54.564 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts dst=null perm=null proto=rpc
18:46:54.564 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/task_202505191846545029178356309085034_1234_m_000000 dst=null perm=null proto=rpc
18:46:54.565 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/part-00000 dst=null perm=null proto=rpc
18:46:54.565 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary/0/task_202505191846545029178356309085034_1234_m_000000/part-00000 dst=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/part-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.566 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_temporary dst=null perm=null proto=rpc
18:46:54.566 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.567 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:54.567 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/.spark-staging-1234 dst=null perm=null proto=rpc
18:46:54.568 INFO SparkHadoopWriter - Write Job job_202505191846545029178356309085034_1234 committed. Elapsed time: 4 ms.
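Job 193 writes the SAM body as plain text (the result stage is a saveAsTextFile at SamSink.java:65), producing the part-00000 committed above; the header is written separately and concatenated in front of it. A minimal sketch of the text-write step with hypothetical record strings:

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SamTextSinkSketch {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                    new SparkConf().setMaster("local[1]").setAppName("sam-text-sink-sketch"));
            // Hypothetical tab-separated SAM record lines; the header is not part of this RDD,
            // it goes into a separate "header" file that is concatenated in front of the parts.
            JavaRDD<String> samLines = sc.parallelize(Arrays.asList(
                    "read1\t0\tchr17\t100\t60\t10M\t*\t0\t0\tACGTACGTAC\tFFFFFFFFFF"));
            samLines.saveAsTextFile("hdfs://localhost:36797/user/runner/example.sam.parts");
            sc.stop();
        }
    }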
18:46:54.568 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.569 INFO StateChange - BLOCK* allocate blk_1073741916_1092, replicas=127.0.0.1:38019 for /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/header
18:46:54.570 INFO DataNode - Receiving BP-1968466779-10.1.0.176-1747680367669:blk_1073741916_1092 src: /127.0.0.1:58826 dest: /127.0.0.1:38019
18:46:54.571 INFO clienttrace - src: /127.0.0.1:58826, dest: /127.0.0.1:38019, bytes: 85829, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1789236052_1, offset: 0, srvID: f8ef6dd4-baf0-4c1e-9770-2b8cf408ab4d, blockid: BP-1968466779-10.1.0.176-1747680367669:blk_1073741916_1092, duration(ns): 490838
18:46:54.571 INFO DataNode - PacketResponder: BP-1968466779-10.1.0.176-1747680367669:blk_1073741916_1092, type=LAST_IN_PIPELINE terminating
18:46:54.572 INFO FSNamesystem - BLOCK* blk_1073741916_1092 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/header
18:46:54.972 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/header is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:54.973 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts dst=null perm=null proto=rpc
18:46:54.974 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.974 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/output is closed by DFSClient_NONMAPREDUCE_-1789236052_1
18:46:54.975 INFO HadoopFileSystemWrapper - Concatenating 2 parts to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam
18:46:54.975 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/header, /user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/part-00000] dst=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.975 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.976 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.976 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam perm=runner:supergroup:rw-r--r-- proto=rpc
18:46:54.976 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam done
18:46:54.977 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam.parts dst=null perm=null proto=rpc
18:46:54.977 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.977 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.978 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.978 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
WARNING 2025-05-19 18:46:54 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:54.980 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
18:46:54.981 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
18:46:54.981 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.981 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.982 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
WARNING 2025-05-19 18:46:54 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
18:46:54.983 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
18:46:54.984 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
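The two SamReaderFactory WARNING lines come from htsjdk: when handed a raw stream over the freshly written .sam it cannot sniff the format, so it assumes SAM text. A small sketch of opening a SAM file through the same factory, with a hypothetical local path in place of the HDFS stream:

    import java.io.File;
    import htsjdk.samtools.SAMRecord;
    import htsjdk.samtools.SamReader;
    import htsjdk.samtools.SamReaderFactory;
    import htsjdk.samtools.ValidationStringency;

    public class SamReaderSketch {
        public static void main(String[] args) throws Exception {
            File sam = new File("/tmp/example.sam"); // hypothetical path
            try (SamReader reader = SamReaderFactory.makeDefault()
                    .validationStringency(ValidationStringency.SILENT)
                    .open(sam)) {
                long n = 0;
                for (SAMRecord r : reader) {
                    n++;
                }
                System.out.println(n + " records");
            }
        }
    }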
18:46:54.984 INFO MemoryStore - Block broadcast_520 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
18:46:54.985 INFO MemoryStore - Block broadcast_520_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
18:46:54.985 INFO BlockManagerInfo - Added broadcast_520_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.3 MiB)
18:46:54.985 INFO SparkContext - Created broadcast 520 from broadcast at SamSource.java:78
18:46:54.986 INFO MemoryStore - Block broadcast_521 stored as values in memory (estimated size 297.9 KiB, free 1915.9 MiB)
18:46:54.992 INFO MemoryStore - Block broadcast_521_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.9 MiB)
18:46:54.992 INFO BlockManagerInfo - Added broadcast_521_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.2 MiB)
18:46:54.993 INFO SparkContext - Created broadcast 521 from newAPIHadoopFile at SamSource.java:108
18:46:54.995 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:54.995 INFO FileInputFormat - Total input files to process : 1
18:46:54.995 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:55.000 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:55.000 INFO DAGScheduler - Got job 194 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:55.000 INFO DAGScheduler - Final stage: ResultStage 257 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:55.000 INFO DAGScheduler - Parents of final stage: List()
18:46:55.000 INFO DAGScheduler - Missing parents: List()
18:46:55.000 INFO DAGScheduler - Submitting ResultStage 257 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:55.000 INFO MemoryStore - Block broadcast_522 stored as values in memory (estimated size 7.5 KiB, free 1915.8 MiB)
18:46:55.001 INFO MemoryStore - Block broadcast_522_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1915.8 MiB)
18:46:55.001 INFO BlockManagerInfo - Added broadcast_522_piece0 in memory on localhost:45727 (size: 3.8 KiB, free: 1919.2 MiB)
18:46:55.001 INFO SparkContext - Created broadcast 522 from broadcast at DAGScheduler.scala:1580
18:46:55.001 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 257 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:55.001 INFO TaskSchedulerImpl - Adding task set 257.0 with 1 tasks resource profile 0
18:46:55.002 INFO TaskSetManager - Starting task 0.0 in stage 257.0 (TID 313) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:55.002 INFO Executor - Running task 0.0 in stage 257.0 (TID 313)
18:46:55.003 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam:0+847558
18:46:55.004 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:55.005 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
18:46:55.010 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
18:46:55.015 INFO Executor - Finished task 0.0 in stage 257.0 (TID 313). 651483 bytes result sent to driver
18:46:55.017 INFO TaskSetManager - Finished task 0.0 in stage 257.0 (TID 313) in 16 ms on localhost (executor driver) (1/1)
18:46:55.017 INFO TaskSchedulerImpl - Removed TaskSet 257.0, whose tasks have all completed, from pool
18:46:55.017 INFO DAGScheduler - ResultStage 257 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.017 s
18:46:55.017 INFO DAGScheduler - Job 194 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:55.017 INFO TaskSchedulerImpl - Killing all running tasks in stage 257: Stage finished
18:46:55.017 INFO DAGScheduler - Job 194 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.017676 s
18:46:55.026 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:55.027 INFO DAGScheduler - Got job 195 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:55.027 INFO DAGScheduler - Final stage: ResultStage 258 (count at ReadsSparkSinkUnitTest.java:185)
18:46:55.027 INFO DAGScheduler - Parents of final stage: List()
18:46:55.027 INFO DAGScheduler - Missing parents: List()
18:46:55.027 INFO DAGScheduler - Submitting ResultStage 258 (MapPartitionsRDD[1221] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:55.043 INFO MemoryStore - Block broadcast_523 stored as values in memory (estimated size 426.1 KiB, free 1915.4 MiB)
18:46:55.049 INFO BlockManagerInfo - Removed broadcast_503_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.2 MiB)
18:46:55.049 INFO BlockManagerInfo - Removed broadcast_511_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.3 MiB)
18:46:55.049 INFO MemoryStore - Block broadcast_523_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.0 MiB)
18:46:55.049 INFO BlockManagerInfo - Added broadcast_523_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.1 MiB)
18:46:55.050 INFO SparkContext - Created broadcast 523 from broadcast at DAGScheduler.scala:1580
18:46:55.050 INFO BlockManagerInfo - Removed broadcast_522_piece0 on localhost:45727 in memory (size: 3.8 KiB, free: 1919.2 MiB)
18:46:55.050 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 258 (MapPartitionsRDD[1221] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:55.050 INFO TaskSchedulerImpl - Adding task set 258.0 with 1 tasks resource profile 0
18:46:55.050 INFO BlockManagerInfo - Removed broadcast_510_piece0 on localhost:45727 in memory (size: 187.0 B, free: 1919.2 MiB)
18:46:55.051 INFO TaskSetManager - Starting task 0.0 in stage 258.0 (TID 314) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:55.051 INFO Executor - Running task 0.0 in stage 258.0 (TID 314)
18:46:55.051 INFO BlockManagerInfo - Removed broadcast_502_piece0 on localhost:45727 in memory (size: 228.0 B, free: 1919.2 MiB)
18:46:55.052 INFO BlockManagerInfo - Removed broadcast_514_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.3 MiB)
18:46:55.053 INFO BlockManagerInfo - Removed broadcast_517_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:55.053 INFO BlockManagerInfo - Removed broadcast_512_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.4 MiB)
18:46:55.054 INFO BlockManagerInfo - Removed broadcast_516_piece0 on localhost:45727 in memory (size: 50.2 KiB, free: 1919.4 MiB)
18:46:55.054 INFO BlockManagerInfo - Removed broadcast_513_piece0 on localhost:45727 in memory (size: 103.6 KiB, free: 1919.5 MiB)
18:46:55.055 INFO BlockManagerInfo - Removed broadcast_518_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.7 MiB)
18:46:55.055 INFO BlockManagerInfo - Removed broadcast_519_piece0 on localhost:45727 in memory (size: 67.0 KiB, free: 1919.7 MiB)
18:46:55.082 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:55.091 INFO Executor - Finished task 0.0 in stage 258.0 (TID 314). 989 bytes result sent to driver
18:46:55.091 INFO TaskSetManager - Finished task 0.0 in stage 258.0 (TID 314) in 41 ms on localhost (executor driver) (1/1)
18:46:55.091 INFO TaskSchedulerImpl - Removed TaskSet 258.0, whose tasks have all completed, from pool
18:46:55.092 INFO DAGScheduler - ResultStage 258 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
18:46:55.092 INFO DAGScheduler - Job 195 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:55.092 INFO TaskSchedulerImpl - Killing all running tasks in stage 258: Stage finished
18:46:55.092 INFO DAGScheduler - Job 195 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.065358 s
18:46:55.095 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:55.095 INFO DAGScheduler - Got job 196 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:55.095 INFO DAGScheduler - Final stage: ResultStage 259 (count at ReadsSparkSinkUnitTest.java:185)
18:46:55.095 INFO DAGScheduler - Parents of final stage: List()
18:46:55.095 INFO DAGScheduler - Missing parents: List()
18:46:55.095 INFO DAGScheduler - Submitting ResultStage 259 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:55.096 INFO MemoryStore - Block broadcast_524 stored as values in memory (estimated size 7.4 KiB, free 1918.6 MiB)
18:46:55.096 INFO MemoryStore - Block broadcast_524_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.6 MiB)
18:46:55.096 INFO BlockManagerInfo - Added broadcast_524_piece0 in memory on localhost:45727 (size: 3.8 KiB, free: 1919.7 MiB)
18:46:55.096 INFO SparkContext - Created broadcast 524 from broadcast at DAGScheduler.scala:1580
18:46:55.097 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 259 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:55.097 INFO TaskSchedulerImpl - Adding task set 259.0 with 1 tasks resource profile 0
18:46:55.097 INFO TaskSetManager - Starting task 0.0 in stage 259.0 (TID 315) (localhost, executor driver, partition 0, ANY, 7852 bytes)
18:46:55.097 INFO Executor - Running task 0.0 in stage 259.0 (TID 315)
18:46:55.098 INFO NewHadoopRDD - Input split: hdfs://localhost:36797/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam:0+847558
18:46:55.100 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_6c9f1e53-d0ad-4c3b-94fd-5a23820c7905.sam dst=null perm=null proto=rpc
18:46:55.101 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
18:46:55.107 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
18:46:55.107 INFO Executor - Finished task 0.0 in stage 259.0 (TID 315). 946 bytes result sent to driver
18:46:55.107 INFO TaskSetManager - Finished task 0.0 in stage 259.0 (TID 315) in 10 ms on localhost (executor driver) (1/1)
18:46:55.108 INFO TaskSchedulerImpl - Removed TaskSet 259.0, whose tasks have all completed, from pool
18:46:55.108 INFO DAGScheduler - ResultStage 259 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.012 s
18:46:55.108 INFO DAGScheduler - Job 196 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:55.108 INFO TaskSchedulerImpl - Killing all running tasks in stage 259: Stage finished
18:46:55.108 INFO DAGScheduler - Job 196 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.012803 s
18:46:55.110 INFO MemoryStore - Block broadcast_525 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
18:46:55.116 INFO MemoryStore - Block broadcast_525_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
18:46:55.116 INFO BlockManagerInfo - Added broadcast_525_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.7 MiB)
18:46:55.117 INFO SparkContext - Created broadcast 525 from newAPIHadoopFile at PathSplitSource.java:96
18:46:55.139 INFO MemoryStore - Block broadcast_526 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
18:46:55.145 INFO MemoryStore - Block broadcast_526_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
18:46:55.145 INFO BlockManagerInfo - Added broadcast_526_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.6 MiB)
18:46:55.145 INFO SparkContext - Created broadcast 526 from newAPIHadoopFile at PathSplitSource.java:96
18:46:55.165 INFO FileInputFormat - Total input files to process : 1
18:46:55.167 INFO MemoryStore - Block broadcast_527 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
18:46:55.167 INFO MemoryStore - Block broadcast_527_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
18:46:55.168 INFO BlockManagerInfo - Added broadcast_527_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:55.168 INFO SparkContext - Created broadcast 527 from broadcast at ReadsSparkSink.java:133
18:46:55.169 INFO MemoryStore - Block broadcast_528 stored as values in memory (estimated size 163.2 KiB, free 1917.6 MiB)
18:46:55.169 INFO MemoryStore - Block broadcast_528_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.6 MiB)
18:46:55.170 INFO BlockManagerInfo - Added broadcast_528_piece0 in memory on localhost:45727 (size: 9.6 KiB, free: 1919.6 MiB)
18:46:55.170 INFO SparkContext - Created broadcast 528 from broadcast at BamSink.java:76
18:46:55.171 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:55.171 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:55.171 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:55.188 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
18:46:55.188 INFO DAGScheduler - Registering RDD 1253 (mapToPair at SparkUtils.java:161) as input to shuffle 52
18:46:55.188 INFO DAGScheduler - Got job 197 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
18:46:55.188 INFO DAGScheduler - Final stage: ResultStage 261 (runJob at SparkHadoopWriter.scala:83)
18:46:55.188 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 260)
18:46:55.188 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 260)
18:46:55.188 INFO DAGScheduler - Submitting ShuffleMapStage 260 (MapPartitionsRDD[1253] at mapToPair at SparkUtils.java:161), which has no missing parents
18:46:55.206 INFO MemoryStore - Block broadcast_529 stored as values in memory (estimated size 520.4 KiB, free 1917.1 MiB)
18:46:55.207 INFO MemoryStore - Block broadcast_529_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.9 MiB)
18:46:55.207 INFO BlockManagerInfo - Added broadcast_529_piece0 in memory on localhost:45727 (size: 166.1 KiB, free: 1919.5 MiB)
18:46:55.207 INFO SparkContext - Created broadcast 529 from broadcast at DAGScheduler.scala:1580
18:46:55.207 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 260 (MapPartitionsRDD[1253] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
18:46:55.207 INFO TaskSchedulerImpl - Adding task set 260.0 with 1 tasks resource profile 0
18:46:55.208 INFO TaskSetManager - Starting task 0.0 in stage 260.0 (TID 316) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
18:46:55.208 INFO Executor - Running task 0.0 in stage 260.0 (TID 316)
18:46:55.238 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:55.252 INFO Executor - Finished task 0.0 in stage 260.0 (TID 316). 1148 bytes result sent to driver
18:46:55.253 INFO TaskSetManager - Finished task 0.0 in stage 260.0 (TID 316) in 45 ms on localhost (executor driver) (1/1)
18:46:55.253 INFO TaskSchedulerImpl - Removed TaskSet 260.0, whose tasks have all completed, from pool
18:46:55.253 INFO DAGScheduler - ShuffleMapStage 260 (mapToPair at SparkUtils.java:161) finished in 0.064 s
18:46:55.253 INFO DAGScheduler - looking for newly runnable stages
18:46:55.253 INFO DAGScheduler - running: HashSet()
18:46:55.253 INFO DAGScheduler - waiting: HashSet(ResultStage 261)
18:46:55.253 INFO DAGScheduler - failed: HashSet()
18:46:55.253 INFO DAGScheduler - Submitting ResultStage 261 (MapPartitionsRDD[1258] at mapToPair at BamSink.java:91), which has no missing parents
18:46:55.260 INFO MemoryStore - Block broadcast_530 stored as values in memory (estimated size 241.4 KiB, free 1916.7 MiB)
18:46:55.261 INFO MemoryStore - Block broadcast_530_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.6 MiB)
18:46:55.261 INFO BlockManagerInfo - Added broadcast_530_piece0 in memory on localhost:45727 (size: 67.0 KiB, free: 1919.4 MiB)
18:46:55.261 INFO SparkContext - Created broadcast 530 from broadcast at DAGScheduler.scala:1580
18:46:55.261 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 261 (MapPartitionsRDD[1258] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
18:46:55.261 INFO TaskSchedulerImpl - Adding task set 261.0 with 1 tasks resource profile 0
18:46:55.262 INFO TaskSetManager - Starting task 0.0 in stage 261.0 (TID 317) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
18:46:55.262 INFO Executor - Running task 0.0 in stage 261.0 (TID 317)
18:46:55.267 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
18:46:55.267 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
18:46:55.279 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:55.279 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:55.279 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:55.279 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
18:46:55.279 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
18:46:55.279 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
18:46:55.301 INFO FileOutputCommitter - Saved output of task 'attempt_202505191846551382813991263836459_1258_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest116302892682405950549.bam.parts/_temporary/0/task_202505191846551382813991263836459_1258_r_000000
18:46:55.301 INFO SparkHadoopMapRedUtil - attempt_202505191846551382813991263836459_1258_r_000000_0: Committed. Elapsed time: 0 ms.
18:46:55.302 INFO Executor - Finished task 0.0 in stage 261.0 (TID 317). 1858 bytes result sent to driver
18:46:55.302 INFO TaskSetManager - Finished task 0.0 in stage 261.0 (TID 317) in 41 ms on localhost (executor driver) (1/1)
18:46:55.302 INFO TaskSchedulerImpl - Removed TaskSet 261.0, whose tasks have all completed, from pool
18:46:55.302 INFO DAGScheduler - ResultStage 261 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
18:46:55.302 INFO DAGScheduler - Job 197 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:55.302 INFO TaskSchedulerImpl - Killing all running tasks in stage 261: Stage finished
18:46:55.302 INFO DAGScheduler - Job 197 finished: runJob at SparkHadoopWriter.scala:83, took 0.114459 s
18:46:55.302 INFO SparkHadoopWriter - Start to commit write Job job_202505191846551382813991263836459_1258.
18:46:55.307 INFO SparkHadoopWriter - Write Job job_202505191846551382813991263836459_1258 committed. Elapsed time: 4 ms.
18:46:55.318 INFO HadoopFileSystemWrapper - Concatenating 3 parts to file:////tmp/ReadsSparkSinkUnitTest116302892682405950549.bam
18:46:55.322 INFO HadoopFileSystemWrapper - Concatenating to file:////tmp/ReadsSparkSinkUnitTest116302892682405950549.bam done
18:46:55.322 INFO IndexFileMerger - Merging .sbi files in temp directory file:////tmp/ReadsSparkSinkUnitTest116302892682405950549.bam.parts/ to file:////tmp/ReadsSparkSinkUnitTest116302892682405950549.bam.sbi
18:46:55.326 INFO IndexFileMerger - Done merging .sbi files
18:46:55.326 INFO IndexFileMerger - Merging .bai files in temp directory file:////tmp/ReadsSparkSinkUnitTest116302892682405950549.bam.parts/ to file:////tmp/ReadsSparkSinkUnitTest116302892682405950549.bam.bai
18:46:55.331 INFO IndexFileMerger - Done merging .bai files
18:46:55.333 INFO MemoryStore - Block broadcast_531 stored as values in memory (estimated size 320.0 B, free 1916.6 MiB)
18:46:55.333 INFO MemoryStore - Block broadcast_531_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.6 MiB)
18:46:55.333 INFO BlockManagerInfo - Added broadcast_531_piece0 in memory on localhost:45727 (size: 233.0 B, free: 1919.4 MiB)
18:46:55.334 INFO SparkContext - Created broadcast 531 from broadcast at BamSource.java:104
18:46:55.334 INFO MemoryStore - Block broadcast_532 stored as values in memory (estimated size 297.9 KiB, free 1916.3 MiB)
18:46:55.340 INFO MemoryStore - Block broadcast_532_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.2 MiB)
18:46:55.340 INFO BlockManagerInfo - Added broadcast_532_piece0 in memory on localhost:45727 (size: 50.2 KiB, free: 1919.3 MiB)
18:46:55.341 INFO SparkContext - Created broadcast 532 from newAPIHadoopFile at PathSplitSource.java:96
18:46:55.349 INFO FileInputFormat - Total input files to process : 1
18:46:55.363 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
18:46:55.364 INFO DAGScheduler - Got job 198 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
18:46:55.364 INFO DAGScheduler - Final stage: ResultStage 262 (collect at ReadsSparkSinkUnitTest.java:182)
18:46:55.364 INFO DAGScheduler - Parents of final stage: List()
18:46:55.364 INFO DAGScheduler - Missing parents: List()
18:46:55.364 INFO DAGScheduler - Submitting ResultStage 262 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:55.370 INFO MemoryStore - Block broadcast_533 stored as values in memory (estimated size 148.2 KiB, free 1916.1 MiB)
18:46:55.371 INFO MemoryStore - Block broadcast_533_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1916.1 MiB)
18:46:55.371 INFO BlockManagerInfo - Added broadcast_533_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.3 MiB)
18:46:55.371 INFO SparkContext - Created broadcast 533 from broadcast at DAGScheduler.scala:1580
18:46:55.371 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 262 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:55.371 INFO TaskSchedulerImpl - Adding task set 262.0 with 1 tasks resource profile 0
18:46:55.371 INFO TaskSetManager - Starting task 0.0 in stage 262.0 (TID 318) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:55.372 INFO Executor - Running task 0.0 in stage 262.0 (TID 318)
18:46:55.383 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest116302892682405950549.bam:0+237038
18:46:55.387 INFO Executor - Finished task 0.0 in stage 262.0 (TID 318). 651483 bytes result sent to driver
18:46:55.388 INFO TaskSetManager - Finished task 0.0 in stage 262.0 (TID 318) in 17 ms on localhost (executor driver) (1/1)
18:46:55.388 INFO TaskSchedulerImpl - Removed TaskSet 262.0, whose tasks have all completed, from pool
18:46:55.388 INFO DAGScheduler - ResultStage 262 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.024 s
18:46:55.388 INFO DAGScheduler - Job 198 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:55.388 INFO TaskSchedulerImpl - Killing all running tasks in stage 262: Stage finished
18:46:55.388 INFO DAGScheduler - Job 198 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025020 s
18:46:55.398 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:55.398 INFO DAGScheduler - Got job 199 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:55.398 INFO DAGScheduler - Final stage: ResultStage 263 (count at ReadsSparkSinkUnitTest.java:185)
18:46:55.398 INFO DAGScheduler - Parents of final stage: List()
18:46:55.398 INFO DAGScheduler - Missing parents: List()
18:46:55.398 INFO DAGScheduler - Submitting ResultStage 263 (MapPartitionsRDD[1246] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:55.415 INFO MemoryStore - Block broadcast_534 stored as values in memory (estimated size 426.1 KiB, free 1915.6 MiB)
18:46:55.416 INFO MemoryStore - Block broadcast_534_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.5 MiB)
18:46:55.416 INFO BlockManagerInfo - Added broadcast_534_piece0 in memory on localhost:45727 (size: 153.6 KiB, free: 1919.1 MiB)
18:46:55.416 INFO SparkContext - Created broadcast 534 from broadcast at DAGScheduler.scala:1580
18:46:55.416 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 263 (MapPartitionsRDD[1246] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:55.416 INFO TaskSchedulerImpl - Adding task set 263.0 with 1 tasks resource profile 0
18:46:55.417 INFO TaskSetManager - Starting task 0.0 in stage 263.0 (TID 319) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
18:46:55.417 INFO Executor - Running task 0.0 in stage 263.0 (TID 319)
18:46:55.446 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
18:46:55.455 INFO Executor - Finished task 0.0 in stage 263.0 (TID 319). 989 bytes result sent to driver
18:46:55.455 INFO TaskSetManager - Finished task 0.0 in stage 263.0 (TID 319) in 38 ms on localhost (executor driver) (1/1)
18:46:55.455 INFO TaskSchedulerImpl - Removed TaskSet 263.0, whose tasks have all completed, from pool
18:46:55.455 INFO DAGScheduler - ResultStage 263 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
18:46:55.455 INFO DAGScheduler - Job 199 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:55.455 INFO TaskSchedulerImpl - Killing all running tasks in stage 263: Stage finished
18:46:55.455 INFO DAGScheduler - Job 199 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057572 s
18:46:55.458 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
18:46:55.459 INFO DAGScheduler - Got job 200 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
18:46:55.459 INFO DAGScheduler - Final stage: ResultStage 264 (count at ReadsSparkSinkUnitTest.java:185)
18:46:55.459 INFO DAGScheduler - Parents of final stage: List()
18:46:55.459 INFO DAGScheduler - Missing parents: List()
18:46:55.459 INFO DAGScheduler - Submitting ResultStage 264 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96), which has no missing parents
18:46:55.465 INFO MemoryStore - Block broadcast_535 stored as values in memory (estimated size 148.1 KiB, free 1915.3 MiB)
18:46:55.466 INFO MemoryStore - Block broadcast_535_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.3 MiB)
18:46:55.466 INFO BlockManagerInfo - Added broadcast_535_piece0 in memory on localhost:45727 (size: 54.5 KiB, free: 1919.1 MiB)
18:46:55.466 INFO SparkContext - Created broadcast 535 from broadcast at DAGScheduler.scala:1580
18:46:55.466 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 264 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
18:46:55.466 INFO TaskSchedulerImpl - Adding task set 264.0 with 1 tasks resource profile 0
18:46:55.466 INFO TaskSetManager - Starting task 0.0 in stage 264.0 (TID 320) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
18:46:55.467 INFO Executor - Running task 0.0 in stage 264.0 (TID 320)
18:46:55.477 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest116302892682405950549.bam:0+237038
18:46:55.485 INFO Executor - Finished task 0.0 in stage 264.0 (TID 320). 1075 bytes result sent to driver
18:46:55.485 INFO BlockManagerInfo - Removed broadcast_529_piece0 on localhost:45727 in memory (size: 166.1 KiB, free: 1919.3 MiB)
18:46:55.485 INFO TaskSetManager - Finished task 0.0 in stage 264.0 (TID 320) in 19 ms on localhost (executor driver) (1/1)
18:46:55.485 INFO TaskSchedulerImpl - Removed TaskSet 264.0, whose tasks have all completed, from pool
18:46:55.485 INFO BlockManagerInfo - Removed broadcast_520_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)
18:46:55.485 INFO DAGScheduler - ResultStage 264 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
18:46:55.485 INFO DAGScheduler - Job 200 is finished. Cancelling potential speculative or zombie tasks for this job
18:46:55.485 INFO TaskSchedulerImpl - Killing all running tasks in stage 264: Stage finished
18:46:55.485 INFO DAGScheduler - Job 200 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026985 s
18:46:55.486 INFO BlockManagerInfo - Removed broadcast_527_piece0 on localhost:45727 in memory (size: 9.6 KiB, free: 1919.3 MiB)