19:48:14.857 INFO MiniDFSCluster - starting cluster: numNameNodes=1, numDataNodes=1
19:48:15.071 INFO NameNode - Formatting using clusterid: testClusterID
19:48:15.080 INFO FSEditLog - Edit logging is async:true
19:48:15.094 INFO FSNamesystem - KeyProvider: null
19:48:15.095 INFO FSNamesystem - fsLock is fair: true
19:48:15.095 INFO FSNamesystem - Detailed lock hold time metrics enabled: false
19:48:15.096 INFO FSNamesystem - fsOwner = runner (auth:SIMPLE)
19:48:15.096 INFO FSNamesystem - supergroup = supergroup
19:48:15.096 INFO FSNamesystem - isPermissionEnabled = true
19:48:15.096 INFO FSNamesystem - isStoragePolicyEnabled = true
19:48:15.096 INFO FSNamesystem - HA Enabled: false
19:48:15.123 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
19:48:15.127 INFO deprecation - hadoop.configured.node.mapping is deprecated. Instead, use net.topology.configured.node.mapping
19:48:15.127 INFO DatanodeManager - dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
19:48:15.127 INFO DatanodeManager - dfs.namenode.datanode.registration.ip-hostname-check=true
19:48:15.129 INFO BlockManager - dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
19:48:15.129 INFO BlockManager - The block deletion will start around 2025 Jul 15 19:48:15
19:48:15.130 INFO GSet - Computing capacity for map BlocksMap
19:48:15.130 INFO GSet - VM type = 64-bit
19:48:15.131 INFO GSet - 2.0% max memory 3.4 GB = 70 MB
19:48:15.131 INFO GSet - capacity = 2^23 = 8388608 entries
19:48:15.134 INFO BlockManager - Storage policy satisfier is disabled
19:48:15.134 INFO BlockManager - dfs.block.access.token.enable = false
19:48:15.138 INFO BlockManagerSafeMode - dfs.namenode.safemode.threshold-pct = 0.999
19:48:15.138 INFO BlockManagerSafeMode - dfs.namenode.safemode.min.datanodes = 0
19:48:15.138 INFO BlockManagerSafeMode - dfs.namenode.safemode.extension = 0
19:48:15.138 INFO BlockManager - defaultReplication = 1
19:48:15.138 INFO BlockManager - maxReplication = 512
19:48:15.138 INFO BlockManager - minReplication = 1
19:48:15.138 INFO BlockManager - maxReplicationStreams = 2
19:48:15.138 INFO BlockManager - redundancyRecheckInterval = 3000ms
19:48:15.138 INFO BlockManager - encryptDataTransfer = false
19:48:15.138 INFO BlockManager - maxNumBlocksToLog = 1000
19:48:15.154 INFO FSDirectory - GLOBAL serial map: bits=29 maxEntries=536870911
19:48:15.154 INFO FSDirectory - USER serial map: bits=24 maxEntries=16777215
19:48:15.154 INFO FSDirectory - GROUP serial map: bits=24 maxEntries=16777215
19:48:15.154 INFO FSDirectory - XATTR serial map: bits=24 maxEntries=16777215
19:48:15.160 INFO GSet - Computing capacity for map INodeMap
19:48:15.160 INFO GSet - VM type = 64-bit
19:48:15.161 INFO GSet - 1.0% max memory 3.4 GB = 35 MB
19:48:15.161 INFO GSet - capacity = 2^22 = 4194304 entries
19:48:15.162 INFO FSDirectory - ACLs enabled? true
19:48:15.162 INFO FSDirectory - POSIX ACL inheritance enabled? true
19:48:15.162 INFO FSDirectory - XAttrs enabled? true
19:48:15.162 INFO NameNode - Caching file names occurring more than 10 times
19:48:15.165 INFO SnapshotManager - Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
19:48:15.165 INFO SnapshotManager - SkipList is disabled
19:48:15.168 INFO GSet - Computing capacity for map cachedBlocks
19:48:15.168 INFO GSet - VM type = 64-bit
19:48:15.168 INFO GSet - 0.25% max memory 3.4 GB = 8.8 MB
19:48:15.168 INFO GSet - capacity = 2^20 = 1048576 entries
19:48:15.173 INFO TopMetrics - NNTop conf: dfs.namenode.top.window.num.buckets = 10
19:48:15.173 INFO TopMetrics - NNTop conf: dfs.namenode.top.num.users = 10
19:48:15.173 INFO TopMetrics - NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
19:48:15.175 INFO FSNamesystem - Retry cache on namenode is enabled
19:48:15.175 INFO FSNamesystem - Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
19:48:15.176 INFO GSet - Computing capacity for map NameNodeRetryCache
19:48:15.176 INFO GSet - VM type = 64-bit
19:48:15.176 INFO GSet - 0.029999999329447746% max memory 3.4 GB = 1.0 MB
19:48:15.176 INFO GSet - capacity = 2^17 = 131072 entries
19:48:15.186 INFO FSImage - Allocated new BlockPoolId: BP-1160364076-10.1.0.127-1752608895182
19:48:15.192 INFO Storage - Storage directory /tmp/minicluster_storage10689261495343833868/name-0-1 has been successfully formatted.
19:48:15.193 INFO Storage - Storage directory /tmp/minicluster_storage10689261495343833868/name-0-2 has been successfully formatted.
19:48:15.207 INFO FSImageFormatProtobuf - Saving image file /tmp/minicluster_storage10689261495343833868/name-0-1/current/fsimage.ckpt_0000000000000000000 using no compression
19:48:15.207 INFO FSImageFormatProtobuf - Saving image file /tmp/minicluster_storage10689261495343833868/name-0-2/current/fsimage.ckpt_0000000000000000000 using no compression
19:48:15.325 INFO FSImageFormatProtobuf - Image file /tmp/minicluster_storage10689261495343833868/name-0-1/current/fsimage.ckpt_0000000000000000000 of size 401 bytes saved in 0 seconds .
19:48:15.325 INFO FSImageFormatProtobuf - Image file /tmp/minicluster_storage10689261495343833868/name-0-2/current/fsimage.ckpt_0000000000000000000 of size 401 bytes saved in 0 seconds .
19:48:15.338 INFO NNStorageRetentionManager - Going to retain 1 images with txid >= 0
19:48:15.364 INFO FSNamesystem - Stopping services started for active state
19:48:15.364 INFO FSNamesystem - Stopping services started for standby state
19:48:15.365 INFO NameNode - createNameNode []
19:48:15.398 WARN MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
19:48:15.405 INFO MetricsSystemImpl - Scheduled Metric snapshot period at 10 second(s).
19:48:15.405 INFO MetricsSystemImpl - NameNode metrics system started
19:48:15.409 INFO NameNodeUtils - fs.defaultFS is hdfs://127.0.0.1:0
19:48:15.430 INFO JvmPauseMonitor - Starting JVM pause monitor
19:48:15.440 INFO DFSUtil - Filter initializers set : org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
19:48:15.444 INFO DFSUtil - Starting Web-server for hdfs at: http://localhost:0
19:48:15.452 INFO log - Logging initialized @26664ms to org.eclipse.jetty.util.log.Slf4jLog
19:48:15.511 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
19:48:15.514 WARN HttpRequestLog - Jetty request log can only be enabled using Log4j
19:48:15.517 INFO HttpServer2 - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
19:48:15.518 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
19:48:15.518 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
19:48:15.520 INFO HttpServer2 - Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context hdfs
19:48:15.520 INFO HttpServer2 - Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context static
19:48:15.545 INFO HttpServer2 - addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
19:48:15.548 INFO HttpServer2 - Jetty bound to port 46527
19:48:15.549 INFO Server - jetty-9.4.56.v20240826; built: 2024-08-26T17:15:05.868Z; git: ec6782ff5ead824dabdcf47fa98f90a4aedff401; jvm 17.0.6+10
19:48:15.570 INFO session - DefaultSessionIdManager workerName=node0
19:48:15.570 INFO session - No SessionScavenger set, using defaults
19:48:15.571 INFO session - node0 Scavenging every 660000ms
19:48:15.581 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
19:48:15.583 INFO ContextHandler - Started o.e.j.s.ServletContextHandler@6cf4325b{static,/static,jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/static,AVAILABLE}
19:48:15.718 INFO ContextHandler - Started o.e.j.w.WebAppContext@15d0d8c{hdfs,/,file:///tmp/jetty-localhost-46527-hadoop-hdfs-3_3_6-tests_jar-_-any-17710510227434783723/webapp/,AVAILABLE}{jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/hdfs}
19:48:15.722 INFO AbstractConnector - Started ServerConnector@52889c68{HTTP/1.1, (http/1.1)}{localhost:46527}
19:48:15.722 INFO Server - Started @26935ms
19:48:15.726 INFO FSEditLog - Edit logging is async:true
19:48:15.736 INFO FSNamesystem - KeyProvider: null
19:48:15.736 INFO FSNamesystem - fsLock is fair: true
19:48:15.736 INFO FSNamesystem - Detailed lock hold time metrics enabled: false
19:48:15.736 INFO FSNamesystem - fsOwner = runner (auth:SIMPLE)
19:48:15.736 INFO FSNamesystem - supergroup = supergroup
19:48:15.737 INFO FSNamesystem - isPermissionEnabled = true
19:48:15.737 INFO FSNamesystem - isStoragePolicyEnabled = true
19:48:15.737 INFO FSNamesystem - HA Enabled: false
19:48:15.737 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
19:48:15.737 INFO DatanodeManager - dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
19:48:15.737 INFO DatanodeManager - dfs.namenode.datanode.registration.ip-hostname-check=true
19:48:15.737 INFO BlockManager - dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
19:48:15.738 INFO BlockManager - The block deletion will start around 2025 Jul 15 19:48:15
19:48:15.738 INFO GSet - Computing capacity for map BlocksMap
19:48:15.738 INFO GSet - VM type = 64-bit
19:48:15.738 INFO GSet - 2.0% max memory 3.4 GB = 70 MB
19:48:15.738 INFO GSet - capacity = 2^23 = 8388608 entries
19:48:15.754 INFO BlockManagerInfo - Removed broadcast_30_piece0 on localhost:36125 in memory (size: 4.7 KiB, free: 1920.0 MiB)
19:48:15.755 INFO BlockManager - Storage policy satisfier is disabled
19:48:15.755 INFO BlockManager - dfs.block.access.token.enable = false
19:48:15.755 INFO BlockManagerSafeMode - dfs.namenode.safemode.threshold-pct = 0.999
19:48:15.755 INFO BlockManagerSafeMode - dfs.namenode.safemode.min.datanodes = 0
19:48:15.755 INFO BlockManagerSafeMode - dfs.namenode.safemode.extension = 0
19:48:15.755 INFO BlockManager - defaultReplication = 1
19:48:15.756 INFO BlockManager - maxReplication = 512
19:48:15.756 INFO BlockManager - minReplication = 1
19:48:15.756 INFO BlockManager - maxReplicationStreams = 2
19:48:15.756 INFO BlockManager - redundancyRecheckInterval = 3000ms
19:48:15.756 INFO BlockManager - encryptDataTransfer = false
19:48:15.756 INFO BlockManager - maxNumBlocksToLog = 1000
19:48:15.756 INFO GSet - Computing capacity for map INodeMap
19:48:15.756 INFO GSet - VM type = 64-bit
19:48:15.756 INFO GSet - 1.0% max memory 3.4 GB = 35 MB
19:48:15.756 INFO GSet - capacity = 2^22 = 4194304 entries
19:48:15.757 INFO BlockManagerInfo - Removed broadcast_33_piece0 on localhost:36125 in memory (size: 4.8 KiB, free: 1920.0 MiB)
19:48:15.758 INFO BlockManagerInfo - Removed broadcast_32_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1920.0 MiB)
19:48:15.759 INFO FSDirectory - ACLs enabled? true
19:48:15.759 INFO FSDirectory - POSIX ACL inheritance enabled? true
19:48:15.759 INFO FSDirectory - XAttrs enabled? true
19:48:15.759 INFO NameNode - Caching file names occurring more than 10 times
19:48:15.759 INFO SnapshotManager - Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
19:48:15.759 INFO SnapshotManager - SkipList is disabled
19:48:15.759 INFO GSet - Computing capacity for map cachedBlocks
19:48:15.759 INFO GSet - VM type = 64-bit
19:48:15.759 INFO GSet - 0.25% max memory 3.4 GB = 8.8 MB
19:48:15.759 INFO GSet - capacity = 2^20 = 1048576 entries
19:48:15.760 INFO TopMetrics - NNTop conf: dfs.namenode.top.window.num.buckets = 10
19:48:15.760 INFO TopMetrics - NNTop conf: dfs.namenode.top.num.users = 10
19:48:15.760 INFO TopMetrics - NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
19:48:15.760 INFO FSNamesystem - Retry cache on namenode is enabled
19:48:15.760 INFO FSNamesystem - Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
19:48:15.760 INFO GSet - Computing capacity for map NameNodeRetryCache
19:48:15.760 INFO GSet - VM type = 64-bit
19:48:15.761 INFO GSet - 0.029999999329447746% max memory 3.4 GB = 1.0 MB
19:48:15.761 INFO GSet - capacity = 2^17 = 131072 entries
19:48:15.762 INFO BlockManagerInfo - Removed broadcast_29_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1920.0 MiB)
19:48:15.764 INFO BlockManagerInfo - Removed broadcast_22_piece0 on localhost:36125 in memory (size: 159.0 B, free: 1920.0 MiB)
19:48:15.765 INFO Storage - Lock on /tmp/minicluster_storage10689261495343833868/name-0-1/in_use.lock acquired by nodename 3162@pkrvmq0rgcvqdmg
19:48:15.767 INFO Storage - Lock on /tmp/minicluster_storage10689261495343833868/name-0-2/in_use.lock acquired by nodename 3162@pkrvmq0rgcvqdmg
19:48:15.768 INFO BlockManagerInfo - Removed broadcast_23_piece0 on localhost:36125 in memory (size: 465.0 B, free: 1920.0 MiB)
19:48:15.768 INFO FileJournalManager - Recovering unfinalized segments in /tmp/minicluster_storage10689261495343833868/name-0-1/current
19:48:15.768 INFO FileJournalManager - Recovering unfinalized segments in /tmp/minicluster_storage10689261495343833868/name-0-2/current
19:48:15.768 INFO FSImage - No edit log streams selected.
19:48:15.768 INFO FSImage - Planning to load image: FSImageFile(file=/tmp/minicluster_storage10689261495343833868/name-0-1/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
19:48:15.770 INFO BlockManagerInfo - Removed broadcast_26_piece0 on localhost:36125 in memory (size: 3.2 KiB, free: 1920.0 MiB)
19:48:15.772 INFO BlockManagerInfo - Removed broadcast_27_piece0 on localhost:36125 in memory (size: 5.1 KiB, free: 1920.0 MiB)
19:48:15.774 INFO BlockManagerInfo - Removed broadcast_24_piece0 on localhost:36125 in memory (size: 4.3 KiB, free: 1920.0 MiB)
19:48:15.776 INFO BlockManagerInfo - Removed broadcast_25_piece0 on localhost:36125 in memory (size: 4.5 KiB, free: 1920.0 MiB)
19:48:15.777 INFO BlockManagerInfo - Removed broadcast_31_piece0 on localhost:36125 in memory (size: 320.0 B, free: 1920.0 MiB)
19:48:15.778 INFO BlockManagerInfo - Removed broadcast_28_piece0 on localhost:36125 in memory (size: 320.0 B, free: 1920.0 MiB)
19:48:15.779 INFO BlockManager - Removing RDD 47
19:48:15.786 INFO FSImageFormatPBINode - Loading 1 INodes.
19:48:15.787 INFO FSImageFormatPBINode - Successfully loaded 1 inodes
19:48:15.789 INFO FSImageFormatPBINode - Completed update blocks map and name cache, total waiting duration 0ms.
19:48:15.790 INFO FSImageFormatProtobuf - Loaded FSImage in 0 seconds.
19:48:15.790 INFO FSImage - Loaded image for txid 0 from /tmp/minicluster_storage10689261495343833868/name-0-1/current/fsimage_0000000000000000000
19:48:15.793 INFO FSNamesystem - Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
19:48:15.793 INFO FSEditLog - Starting log segment at 1
19:48:15.801 INFO NameCache - initialized with 0 entries 0 lookups
19:48:15.801 INFO FSNamesystem - Finished loading FSImage in 39 msecs
19:48:15.862 INFO NameNode - RPC server is binding to localhost:0
19:48:15.862 INFO NameNode - Enable NameNode state context:false
19:48:15.865 INFO CallQueueManager - Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
19:48:15.872 INFO Server - Listener at localhost:41235
19:48:15.872 INFO Server - Starting Socket Reader #1 for port 0
19:48:15.893 INFO NameNode - Clients are to use localhost:41235 to access this namenode/service.
19:48:15.894 INFO FSNamesystem - Registered FSNamesystemState, ReplicatedBlocksState and ECBlockGroupsState MBeans.
19:48:15.907 INFO LeaseManager - Number of blocks under construction: 0
19:48:15.911 INFO DatanodeAdminDefaultMonitor - Initialized the Default Decommission and Maintenance monitor
19:48:15.912 INFO BlockManager - Start MarkedDeleteBlockScrubber thread
19:48:15.913 INFO BlockManager - initializing replication queues
19:48:15.913 INFO StateChange - STATE* Leaving safe mode after 0 secs
19:48:15.913 INFO StateChange - STATE* Network topology has 0 racks and 0 datanodes
19:48:15.914 INFO StateChange - STATE* UnderReplicatedBlocks has 0 blocks
19:48:15.918 INFO BlockManager - Total number of blocks = 0
19:48:15.918 INFO BlockManager - Number of invalid blocks = 0
19:48:15.918 INFO BlockManager - Number of under-replicated blocks = 0
19:48:15.918 INFO BlockManager - Number of over-replicated blocks = 0
19:48:15.918 INFO BlockManager - Number of blocks being written = 0
19:48:15.918 INFO StateChange - STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 5 msec
19:48:15.928 INFO Server - IPC Server Responder: starting
19:48:15.928 INFO Server - IPC Server listener on 0: starting
19:48:15.930 INFO NameNode - NameNode RPC up at: localhost/127.0.0.1:41235
19:48:15.930 WARN MetricsLoggerTask - Metrics logging will not be async since the logger is not log4j
19:48:15.931 INFO FSNamesystem - Starting services required for active state
19:48:15.931 INFO FSDirectory - Initializing quota with 12 thread(s)
19:48:15.933 INFO FSDirectory - Quota initialization completed in 2 milliseconds
name space=1
storage space=0
storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0, PROVIDED=0
19:48:15.935 INFO CacheReplicationMonitor - Starting CacheReplicationMonitor with interval 30000 milliseconds
19:48:15.941 INFO MiniDFSCluster - Starting DataNode 0 with dfs.datanode.data.dir: [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data1,[DISK]file:/tmp/minicluster_storage10689261495343833868/data/data2
19:48:15.950 INFO ThrottledAsyncChecker - Scheduling a check for [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data1
19:48:15.957 INFO ThrottledAsyncChecker - Scheduling a check for [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data2
19:48:15.967 INFO MetricsSystemImpl - DataNode metrics system started (again)
19:48:15.972 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
19:48:15.975 INFO BlockScanner - Initialized block scanner with targetBytesPerSec 1048576
19:48:15.978 INFO DataNode - Configured hostname is 127.0.0.1
19:48:15.978 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
19:48:15.979 INFO DataNode - Starting DataNode with maxLockedMemory = 0
19:48:15.982 INFO DataNode - Opened streaming server at /127.0.0.1:45925
19:48:15.984 INFO DataNode - Balancing bandwidth is 104857600 bytes/s
19:48:15.984 INFO DataNode - Number threads for balancing is 100
19:48:15.988 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
19:48:15.989 WARN HttpRequestLog - Jetty request log can only be enabled using Log4j
19:48:15.990 INFO HttpServer2 - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
19:48:15.990 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
19:48:15.990 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
19:48:15.992 INFO HttpServer2 - Jetty bound to port 38783
19:48:15.992 INFO Server - jetty-9.4.56.v20240826; built: 2024-08-26T17:15:05.868Z; git: ec6782ff5ead824dabdcf47fa98f90a4aedff401; jvm 17.0.6+10
19:48:15.993 INFO session - DefaultSessionIdManager workerName=node0
19:48:15.993 INFO session - No SessionScavenger set, using defaults
19:48:15.993 INFO session - node0 Scavenging every 600000ms
19:48:15.993 INFO ContextHandler - Started o.e.j.s.ServletContextHandler@68b4e651{static,/static,jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/static,AVAILABLE}
19:48:16.096 INFO ContextHandler - Started o.e.j.w.WebAppContext@52214284{datanode,/,file:///tmp/jetty-localhost-38783-hadoop-hdfs-3_3_6-tests_jar-_-any-5989618901507383294/webapp/,AVAILABLE}{jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/datanode}
19:48:16.097 INFO AbstractConnector - Started ServerConnector@43ebc9b5{HTTP/1.1, (http/1.1)}{localhost:38783}
19:48:16.097 INFO Server - Started @27310ms
19:48:16.101 WARN DatanodeHttpServer - Got null for restCsrfPreventionFilter - will not do any filtering.
19:48:16.102 INFO DatanodeHttpServer - Listening HTTP traffic on /127.0.0.1:36579
19:48:16.102 INFO JvmPauseMonitor - Starting JVM pause monitor
19:48:16.103 INFO DataNode - dnUserName = runner
19:48:16.103 INFO DataNode - supergroup = supergroup
19:48:16.110 INFO CallQueueManager - Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
19:48:16.111 INFO Server - Listener at localhost:39727
19:48:16.111 INFO Server - Starting Socket Reader #1 for port 0
19:48:16.114 INFO DataNode - Opened IPC server at /127.0.0.1:39727
19:48:16.129 INFO DataNode - Refresh request received for nameservices: null
19:48:16.130 INFO DataNode - Starting BPOfferServices for nameservices: <default>
19:48:16.135 INFO DataNode - Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:41235 starting to offer service
19:48:16.137 WARN MetricsLoggerTask - Metrics logging will not be async since the logger is not log4j
19:48:16.138 INFO Server - IPC Server Responder: starting
19:48:16.138 INFO Server - IPC Server listener on 0: starting
19:48:16.237 INFO DataNode - Acknowledging ACTIVE Namenode during handshake Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:41235
19:48:16.239 INFO Storage - Using 2 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=2, dataDirs=2)
19:48:16.240 INFO Storage - Lock on /tmp/minicluster_storage10689261495343833868/data/data1/in_use.lock acquired by nodename 3162@pkrvmq0rgcvqdmg
19:48:16.240 INFO Storage - Storage directory with location [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data1 is not formatted for namespace 2041704794. Formatting...
19:48:16.240 INFO Storage - Generated new storageID DS-ddb3bdbc-88ca-4593-b0c0-6a38066d0179 for directory /tmp/minicluster_storage10689261495343833868/data/data1
19:48:16.244 INFO Storage - Lock on /tmp/minicluster_storage10689261495343833868/data/data2/in_use.lock acquired by nodename 3162@pkrvmq0rgcvqdmg
19:48:16.245 INFO Storage - Storage directory with location [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data2 is not formatted for namespace 2041704794. Formatting...
19:48:16.245 INFO Storage - Generated new storageID DS-86859461-1cbb-4658-a168-5255c1de684d for directory /tmp/minicluster_storage10689261495343833868/data/data2
19:48:16.246 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
19:48:16.251 INFO MiniDFSCluster - dnInfo.length != numDataNodes
19:48:16.251 INFO MiniDFSCluster - Waiting for cluster to become active
19:48:16.264 INFO Storage - Analyzing storage directories for bpid BP-1160364076-10.1.0.127-1752608895182
19:48:16.264 INFO Storage - Locking is disabled for /tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182
19:48:16.264 INFO Storage - Block pool storage directory for location [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data1 and block pool id BP-1160364076-10.1.0.127-1752608895182 is not formatted. Formatting ...
19:48:16.264 INFO Storage - Formatting block pool BP-1160364076-10.1.0.127-1752608895182 directory /tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current
19:48:16.283 INFO Storage - Analyzing storage directories for bpid BP-1160364076-10.1.0.127-1752608895182
19:48:16.283 INFO Storage - Locking is disabled for /tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182
19:48:16.283 INFO Storage - Block pool storage directory for location [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data2 and block pool id BP-1160364076-10.1.0.127-1752608895182 is not formatted. Formatting ...
19:48:16.283 INFO Storage - Formatting block pool BP-1160364076-10.1.0.127-1752608895182 directory /tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current
19:48:16.284 INFO DataNode - Setting up storage: nsid=2041704794;bpid=BP-1160364076-10.1.0.127-1752608895182;lv=-57;nsInfo=lv=-66;cid=testClusterID;nsid=2041704794;c=1752608895182;bpid=BP-1160364076-10.1.0.127-1752608895182;dnuuid=null
19:48:16.285 INFO DataNode - Generated and persisted new Datanode UUID bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e
19:48:16.296 INFO FsDatasetImpl - The datanode lock is a read write lock
19:48:16.325 INFO FsDatasetImpl - Added new volume: DS-ddb3bdbc-88ca-4593-b0c0-6a38066d0179
19:48:16.325 INFO FsDatasetImpl - Added volume - [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data1, StorageType: DISK
19:48:16.326 INFO FsDatasetImpl - Added new volume: DS-86859461-1cbb-4658-a168-5255c1de684d
19:48:16.326 INFO FsDatasetImpl - Added volume - [DISK]file:/tmp/minicluster_storage10689261495343833868/data/data2, StorageType: DISK
19:48:16.329 INFO MemoryMappableBlockLoader - Initializing cache loader: MemoryMappableBlockLoader.
19:48:16.331 INFO FsDatasetImpl - Registered FSDatasetState MBean
19:48:16.334 INFO FsDatasetImpl - Adding block pool BP-1160364076-10.1.0.127-1752608895182
19:48:16.335 INFO FsDatasetImpl - Scanning block pool BP-1160364076-10.1.0.127-1752608895182 on volume /tmp/minicluster_storage10689261495343833868/data/data1...
19:48:16.335 INFO FsDatasetImpl - Scanning block pool BP-1160364076-10.1.0.127-1752608895182 on volume /tmp/minicluster_storage10689261495343833868/data/data2...
19:48:16.338 WARN FsDatasetImpl - dfsUsed file missing in /tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current, will proceed with Du for space computation
19:48:16.338 WARN FsDatasetImpl - dfsUsed file missing in /tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current, will proceed with Du for space computation
19:48:16.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
19:48:16.352 INFO MiniDFSCluster - dnInfo.length != numDataNodes
19:48:16.352 INFO MiniDFSCluster - Waiting for cluster to become active
19:48:16.367 INFO FsDatasetImpl - Time taken to scan block pool BP-1160364076-10.1.0.127-1752608895182 on /tmp/minicluster_storage10689261495343833868/data/data2: 32ms
19:48:16.372 INFO FsDatasetImpl - Time taken to scan block pool BP-1160364076-10.1.0.127-1752608895182 on /tmp/minicluster_storage10689261495343833868/data/data1: 37ms
19:48:16.372 INFO FsDatasetImpl - Total time to scan all replicas for block pool BP-1160364076-10.1.0.127-1752608895182: 37ms
19:48:16.373 INFO FsDatasetImpl - Adding replicas to map for block pool BP-1160364076-10.1.0.127-1752608895182 on volume /tmp/minicluster_storage10689261495343833868/data/data1...
19:48:16.373 INFO FsDatasetImpl - Adding replicas to map for block pool BP-1160364076-10.1.0.127-1752608895182 on volume /tmp/minicluster_storage10689261495343833868/data/data2...
19:48:16.373 INFO BlockPoolSlice - Replica Cache file: /tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/replicas doesn't exist
19:48:16.373 INFO BlockPoolSlice - Replica Cache file: /tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/replicas doesn't exist
19:48:16.374 INFO FsDatasetImpl - Time to add replicas to map for block pool BP-1160364076-10.1.0.127-1752608895182 on volume /tmp/minicluster_storage10689261495343833868/data/data2: 1ms
19:48:16.374 INFO FsDatasetImpl - Time to add replicas to map for block pool BP-1160364076-10.1.0.127-1752608895182 on volume /tmp/minicluster_storage10689261495343833868/data/data1: 1ms
19:48:16.374 INFO FsDatasetImpl - Total time to add all replicas to map for block pool BP-1160364076-10.1.0.127-1752608895182: 1ms
19:48:16.374 INFO ThrottledAsyncChecker - Scheduling a check for /tmp/minicluster_storage10689261495343833868/data/data1
19:48:16.378 INFO DatasetVolumeChecker - Scheduled health check for volume /tmp/minicluster_storage10689261495343833868/data/data1
19:48:16.378 INFO ThrottledAsyncChecker - Scheduling a check for /tmp/minicluster_storage10689261495343833868/data/data2
19:48:16.378 INFO DatasetVolumeChecker - Scheduled health check for volume /tmp/minicluster_storage10689261495343833868/data/data2
19:48:16.393 INFO VolumeScanner - Now scanning bpid BP-1160364076-10.1.0.127-1752608895182 on volume /tmp/minicluster_storage10689261495343833868/data/data2
19:48:16.394 WARN DirectoryScanner - dfs.datanode.directoryscan.throttle.limit.ms.per.sec set to value above 1000 ms/sec. Assuming default value of -1
19:48:16.394 INFO DirectoryScanner - Periodic Directory Tree Verification scan starting in 17296767ms with interval of 21600000ms and throttle limit of -1ms/s
19:48:16.394 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage10689261495343833868/data/data2, DS-86859461-1cbb-4658-a168-5255c1de684d): finished scanning block pool BP-1160364076-10.1.0.127-1752608895182
19:48:16.395 INFO VolumeScanner - Now scanning bpid BP-1160364076-10.1.0.127-1752608895182 on volume /tmp/minicluster_storage10689261495343833868/data/data1
19:48:16.395 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage10689261495343833868/data/data1, DS-ddb3bdbc-88ca-4593-b0c0-6a38066d0179): finished scanning block pool BP-1160364076-10.1.0.127-1752608895182
19:48:16.398 INFO DataNode - Block pool BP-1160364076-10.1.0.127-1752608895182 (Datanode Uuid bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e) service to localhost/127.0.0.1:41235 beginning handshake with NN
19:48:16.399 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage10689261495343833868/data/data1, DS-ddb3bdbc-88ca-4593-b0c0-6a38066d0179): no suitable block pools found to scan. Waiting 1814399993 ms.
19:48:16.400 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage10689261495343833868/data/data2, DS-86859461-1cbb-4658-a168-5255c1de684d): no suitable block pools found to scan. Waiting 1814399992 ms.
19:48:16.409 INFO StateChange - BLOCK* registerDatanode: from DatanodeRegistration(127.0.0.1:45925, datanodeUuid=bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, infoPort=36579, infoSecurePort=0, ipcPort=39727, storageInfo=lv=-57;cid=testClusterID;nsid=2041704794;c=1752608895182) storage bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e
19:48:16.410 INFO NetworkTopology - Adding a new node: /default-rack/127.0.0.1:45925
19:48:16.410 INFO BlockReportLeaseManager - Registered DN bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e (127.0.0.1:45925).
19:48:16.413 INFO DataNode - Block pool BP-1160364076-10.1.0.127-1752608895182 (Datanode Uuid bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e) service to localhost/127.0.0.1:41235 successfully registered with NN
19:48:16.413 INFO DataNode - For namenode localhost/127.0.0.1:41235 using BLOCKREPORT_INTERVAL of 21600000msecs CACHEREPORT_INTERVAL of 10000msecs Initial delay: 0msecs; heartBeatInterval=3000
19:48:16.415 INFO DataNode - Starting IBR Task Handler.
19:48:16.422 INFO DatanodeDescriptor - Adding new storage ID DS-ddb3bdbc-88ca-4593-b0c0-6a38066d0179 for DN 127.0.0.1:45925
19:48:16.423 INFO DatanodeDescriptor - Adding new storage ID DS-86859461-1cbb-4658-a168-5255c1de684d for DN 127.0.0.1:45925
19:48:16.429 INFO DataNode - After receiving heartbeat response, updating state of namenode localhost:41235 to active
19:48:16.439 INFO BlockStateChange - BLOCK* processReport 0x8c3161441e290105 with lease ID 0x43ca7923fab20140: Processing first storage report for DS-ddb3bdbc-88ca-4593-b0c0-6a38066d0179 from datanode DatanodeRegistration(127.0.0.1:45925, datanodeUuid=bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, infoPort=36579, infoSecurePort=0, ipcPort=39727, storageInfo=lv=-57;cid=testClusterID;nsid=2041704794;c=1752608895182)
19:48:16.440 INFO BlockStateChange - BLOCK* processReport 0x8c3161441e290105 with lease ID 0x43ca7923fab20140: from storage DS-ddb3bdbc-88ca-4593-b0c0-6a38066d0179 node DatanodeRegistration(127.0.0.1:45925, datanodeUuid=bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, infoPort=36579, infoSecurePort=0, ipcPort=39727, storageInfo=lv=-57;cid=testClusterID;nsid=2041704794;c=1752608895182), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
19:48:16.440 INFO BlockStateChange - BLOCK* processReport 0x8c3161441e290105 with lease ID 0x43ca7923fab20140: Processing first storage report for DS-86859461-1cbb-4658-a168-5255c1de684d from datanode DatanodeRegistration(127.0.0.1:45925, datanodeUuid=bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, infoPort=36579, infoSecurePort=0, ipcPort=39727, storageInfo=lv=-57;cid=testClusterID;nsid=2041704794;c=1752608895182)
19:48:16.440 INFO BlockStateChange - BLOCK* processReport 0x8c3161441e290105 with lease ID 0x43ca7923fab20140: from storage DS-86859461-1cbb-4658-a168-5255c1de684d node DatanodeRegistration(127.0.0.1:45925, datanodeUuid=bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, infoPort=36579, infoSecurePort=0, ipcPort=39727, storageInfo=lv=-57;cid=testClusterID;nsid=2041704794;c=1752608895182), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
19:48:16.450 INFO DataNode - Successfully sent block report 0x8c3161441e290105 with lease ID 0x43ca7923fab20140 to namenode: localhost/127.0.0.1:41235, containing 2 storage report(s), of which we sent 2. The reports had 0 total blocks and used 1 RPC(s). This took 2 msecs to generate and 19 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
19:48:16.450 INFO DataNode - Got finalize command for block pool BP-1160364076-10.1.0.127-1752608895182
19:48:16.454 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
19:48:16.458 INFO MiniDFSCluster - Cluster is active
19:48:16.529 INFO MemoryStore - Block broadcast_34 stored as values in memory (estimated size 297.9 KiB, free 1919.7 MiB)
19:48:16.550 INFO MemoryStore - Block broadcast_34_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.7 MiB)
19:48:16.550 INFO BlockManagerInfo - Added broadcast_34_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1920.0 MiB)
19:48:16.551 INFO SparkContext - Created broadcast 34 from newAPIHadoopFile at PathSplitSource.java:96
19:48:16.599 INFO MemoryStore - Block broadcast_35 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
19:48:16.608 INFO MemoryStore - Block broadcast_35_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:16.608 INFO BlockManagerInfo - Added broadcast_35_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:16.608 INFO SparkContext - Created broadcast 35 from newAPIHadoopFile at PathSplitSource.java:96
19:48:16.655 INFO FileInputFormat - Total input files to process : 1
19:48:16.669 INFO MemoryStore - Block broadcast_36 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
19:48:16.673 INFO MemoryStore - Block broadcast_36_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
19:48:16.673 INFO BlockManagerInfo - Added broadcast_36_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.9 MiB)
19:48:16.674 INFO SparkContext - Created broadcast 36 from broadcast at ReadsSparkSink.java:133
19:48:16.683 INFO MemoryStore - Block broadcast_37 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
19:48:16.689 INFO MemoryStore - Block broadcast_37_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
19:48:16.690 INFO BlockManagerInfo - Added broadcast_37_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.9 MiB)
19:48:16.690 INFO SparkContext - Created broadcast 37 from broadcast at BamSink.java:76
19:48:16.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts dst=null perm=null proto=rpc
19:48:16.713 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:16.713 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:16.713 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:16.721 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:16.752 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:16.753 INFO DAGScheduler - Registering RDD 77 (mapToPair at SparkUtils.java:161) as input to shuffle 7
19:48:16.753 INFO DAGScheduler - Got job 20 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:16.753 INFO DAGScheduler - Final stage: ResultStage 30 (runJob at SparkHadoopWriter.scala:83)
19:48:16.753 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 29)
19:48:16.753 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 29)
19:48:16.754 INFO DAGScheduler - Submitting ShuffleMapStage 29 (MapPartitionsRDD[77] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:16.785 INFO MemoryStore - Block broadcast_38 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
19:48:16.788 INFO MemoryStore - Block broadcast_38_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
19:48:16.788 INFO BlockManagerInfo - Added broadcast_38_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.7 MiB)
19:48:16.788 INFO SparkContext - Created broadcast 38 from broadcast at DAGScheduler.scala:1580
19:48:16.788 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 29 (MapPartitionsRDD[77] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:16.789 INFO TaskSchedulerImpl - Adding task set 29.0 with 1 tasks resource profile 0
19:48:16.792 INFO TaskSetManager - Starting task 0.0 in stage 29.0 (TID 67) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:16.793 INFO Executor - Running task 0.0 in stage 29.0 (TID 67)
19:48:16.859 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:16.911 INFO Executor - Finished task 0.0 in stage 29.0 (TID 67). 1148 bytes result sent to driver
19:48:16.912 INFO TaskSetManager - Finished task 0.0 in stage 29.0 (TID 67) in 123 ms on localhost (executor driver) (1/1)
19:48:16.912 INFO TaskSchedulerImpl - Removed TaskSet 29.0, whose tasks have all completed, from pool
19:48:16.912 INFO DAGScheduler - ShuffleMapStage 29 (mapToPair at SparkUtils.java:161) finished in 0.156 s
19:48:16.912 INFO DAGScheduler - looking for newly runnable stages
19:48:16.912 INFO DAGScheduler - running: HashSet()
19:48:16.912 INFO DAGScheduler - waiting: HashSet(ResultStage 30)
19:48:16.912 INFO DAGScheduler - failed: HashSet()
19:48:16.912 INFO DAGScheduler - Submitting ResultStage 30 (MapPartitionsRDD[82] at mapToPair at BamSink.java:91), which has no missing parents
19:48:16.923 INFO MemoryStore - Block broadcast_39 stored as values in memory (estimated size 241.5 KiB, free 1918.1 MiB)
19:48:16.924 INFO MemoryStore - Block broadcast_39_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.0 MiB)
19:48:16.924 INFO BlockManagerInfo - Added broadcast_39_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.7 MiB)
19:48:16.924 INFO SparkContext - Created broadcast 39 from broadcast at DAGScheduler.scala:1580
19:48:16.925 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 30 (MapPartitionsRDD[82] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:16.925 INFO TaskSchedulerImpl - Adding task set 30.0 with 1 tasks resource profile 0
19:48:16.925 INFO TaskSetManager - Starting task 0.0 in stage 30.0 (TID 68) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:16.926 INFO Executor - Running task 0.0 in stage 30.0 (TID 68)
19:48:16.942 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:16.942 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:17.040 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:17.040 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:17.040 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:17.040 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:17.040 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:17.040 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:17.061 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:17.078 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:17.080 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:17.100 INFO StateChange - BLOCK* allocate blk_1073741825_1001, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/part-r-00000
19:48:17.133 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741825_1001 src: /127.0.0.1:35576 dest: /127.0.0.1:45925
19:48:17.154 INFO clienttrace - src: /127.0.0.1:35576, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741825_1001, duration(ns): 4366432
19:48:17.154 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
19:48:17.158 INFO FSNamesystem - BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/part-r-00000
19:48:17.561 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:17.563 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:17.565 INFO StateChange - BLOCK* allocate blk_1073741826_1002, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/.part-r-00000.sbi
19:48:17.567 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741826_1002 src: /127.0.0.1:45148 dest: /127.0.0.1:45925
19:48:17.569 INFO clienttrace - src: /127.0.0.1:45148, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741826_1002, duration(ns): 761865
19:48:17.569 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
19:48:17.570 INFO FSNamesystem - BLOCK* blk_1073741826_1002 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/.part-r-00000.sbi
19:48:17.971 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:17.975 INFO StateChange - BLOCK* allocate blk_1073741827_1003, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/.part-r-00000.bai
19:48:17.976 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741827_1003 src: /127.0.0.1:45162 dest: /127.0.0.1:45925
19:48:17.978 INFO clienttrace - src: /127.0.0.1:45162, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741827_1003, duration(ns): 704520
19:48:17.978 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741827_1003, type=LAST_IN_PIPELINE terminating
19:48:17.980 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:17.983 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0 dst=null perm=null proto=rpc
19:48:17.987 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0 dst=null perm=null proto=rpc
19:48:17.988 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/task_202507151948161648898910171540859_0082_r_000000 dst=null perm=null proto=rpc
19:48:17.993 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/_temporary/attempt_202507151948161648898910171540859_0082_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/task_202507151948161648898910171540859_0082_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:17.994 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948161648898910171540859_0082_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/task_202507151948161648898910171540859_0082_r_000000
19:48:17.995 INFO SparkHadoopMapRedUtil - attempt_202507151948161648898910171540859_0082_r_000000_0: Committed. Elapsed time: 8 ms.
19:48:17.996 INFO Executor - Finished task 0.0 in stage 30.0 (TID 68). 1858 bytes result sent to driver
19:48:17.998 INFO TaskSetManager - Finished task 0.0 in stage 30.0 (TID 68) in 1073 ms on localhost (executor driver) (1/1)
19:48:17.998 INFO TaskSchedulerImpl - Removed TaskSet 30.0, whose tasks have all completed, from pool
19:48:17.998 INFO DAGScheduler - ResultStage 30 (runJob at SparkHadoopWriter.scala:83) finished in 1.085 s
19:48:17.998 INFO DAGScheduler - Job 20 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:17.998 INFO TaskSchedulerImpl - Killing all running tasks in stage 30: Stage finished
19:48:17.999 INFO DAGScheduler - Job 20 finished: runJob at SparkHadoopWriter.scala:83, took 1.246863 s
19:48:18.000 INFO SparkHadoopWriter - Start to commit write Job job_202507151948161648898910171540859_0082.
19:48:18.002 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:18.005 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts dst=null perm=null proto=rpc
19:48:18.005 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/task_202507151948161648898910171540859_0082_r_000000 dst=null perm=null proto=rpc
19:48:18.006 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:18.008 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/task_202507151948161648898910171540859_0082_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.008 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:18.009 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/task_202507151948161648898910171540859_0082_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.010 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:18.011 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary/0/task_202507151948161648898910171540859_0082_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.017 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:18.019 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.021 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:18.024 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.spark-staging-82 dst=null perm=null proto=rpc
19:48:18.024 INFO SparkHadoopWriter - Write Job job_202507151948161648898910171540859_0082 committed. Elapsed time: 23 ms.
19:48:18.025 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.028 INFO StateChange - BLOCK* allocate blk_1073741828_1004, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/header
19:48:18.030 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741828_1004 src: /127.0.0.1:45164 dest: /127.0.0.1:45925
19:48:18.031 INFO clienttrace - src: /127.0.0.1:45164, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741828_1004, duration(ns): 741860
19:48:18.031 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741828_1004, type=LAST_IN_PIPELINE terminating
19:48:18.033 INFO FSNamesystem - BLOCK* blk_1073741828_1004 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/header
19:48:18.434 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:18.436 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.437 INFO StateChange - BLOCK* allocate blk_1073741829_1005, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/terminator
19:48:18.439 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741829_1005 src: /127.0.0.1:45172 dest: /127.0.0.1:45925
19:48:18.440 INFO clienttrace - src: /127.0.0.1:45172, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741829_1005, duration(ns): 644027
19:48:18.440 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741829_1005, type=LAST_IN_PIPELINE terminating
19:48:18.441 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:18.442 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts dst=null perm=null proto=rpc
19:48:18.447 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.448 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:18.448 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam
19:48:18.451 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.452 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:18.454 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.454 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam done
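[editor's note] The concat above is the metadata-only way the final BAM is assembled from its pieces (header, part-r-00000, BGZF terminator): HDFS moves the source blocks onto the target instead of rewriting bytes, and the sources disappear afterwards. In this run the wrapper concatenates into a freshly created "output" file and then renames it; the sketch below, with placeholder URI and paths that are not taken from this run, folds the two steps together by concatenating onto the header and renaming the result.

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConcatBamParts {
        public static void main(String[] args) throws Exception {
            // Placeholder cluster URI and paths, for illustration only.
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:8020"), new Configuration());
            Path parts = new Path("/user/example/out.bam.parts");

            Path header = new Path(parts, "header");
            Path[] rest = { new Path(parts, "part-r-00000"), new Path(parts, "terminator") };

            // HDFS concat appends the sources' blocks onto the target file;
            // no data is copied and the source files are removed.
            fs.concat(header, rest);

            // Give the assembled file its final name next to the .parts directory.
            fs.rename(header, new Path("/user/example/out.bam"));
        }
    }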
19:48:18.454 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:18.456 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi
19:48:18.456 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts dst=null perm=null proto=rpc
19:48:18.458 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.460 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:18.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:18.492 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:18.494 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:18.495 INFO StateChange - BLOCK* allocate blk_1073741830_1006, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi
19:48:18.497 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741830_1006 src: /127.0.0.1:45192 dest: /127.0.0.1:45925
19:48:18.498 INFO clienttrace - src: /127.0.0.1:45192, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741830_1006, duration(ns): 688071
19:48:18.498 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741830_1006, type=LAST_IN_PIPELINE terminating
19:48:18.499 INFO FSNamesystem - BLOCK* blk_1073741830_1006 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi
19:48:18.900 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:18.901 INFO IndexFileMerger - Done merging .sbi files
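[editor's note] The .sbi merge that just finished (and the .bai merge that follows) gathers the hidden per-task index files left in the .parts directory and folds them into one index beside the BAM. The real merger understands the index formats, so offsets are rewritten rather than blindly concatenated; the sketch below only shows the HDFS plumbing of that loop (list the hidden per-part files, stream them into one output, delete them), with hypothetical paths.

    import java.io.OutputStream;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class MergePartIndexes {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path partsDir = new Path("hdfs://localhost:8020/user/example/out.bam.parts");
            Path merged = new Path("hdfs://localhost:8020/user/example/out.bam.sbi");
            FileSystem fs = partsDir.getFileSystem(conf);

            try (OutputStream out = fs.create(merged)) {
                // Only the hidden ".part-*.sbi" files each task wrote.
                for (FileStatus stat : fs.listStatus(partsDir,
                        p -> p.getName().startsWith(".part-") && p.getName().endsWith(".sbi"))) {
                    try (FSDataInputStream in = fs.open(stat.getPath())) {
                        IOUtils.copyBytes(in, out, 64 * 1024, false); // keep 'out' open
                    }
                    fs.delete(stat.getPath(), false); // mirrors the delete seen in the audit lines
                }
            }
        }
    }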
19:48:18.902 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai
19:48:18.903 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts dst=null perm=null proto=rpc
19:48:18.904 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:18.906 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:18.907 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:18.910 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:18.911 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:18.922 INFO StateChange - BLOCK* allocate blk_1073741831_1007, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai
19:48:18.923 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741831_1007 src: /127.0.0.1:45208 dest: /127.0.0.1:45925
19:48:18.926 INFO clienttrace - src: /127.0.0.1:45208, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741831_1007, duration(ns): 1677483
19:48:18.926 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741831_1007, type=LAST_IN_PIPELINE terminating
19:48:18.928 INFO FSNamesystem - BLOCK* blk_1073741831_1007 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai
19:48:19.329 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:19.330 INFO IndexFileMerger - Done merging .bai files
19:48:19.330 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.parts dst=null perm=null proto=rpc
19:48:19.340 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.349 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi dst=null perm=null proto=rpc
19:48:19.350 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi dst=null perm=null proto=rpc
19:48:19.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi dst=null perm=null proto=rpc
19:48:19.353 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:19.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.355 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.356 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.356 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.358 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.359 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.360 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.362 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:19.366 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:19.367 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:19.368 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi dst=null perm=null proto=rpc
19:48:19.368 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi dst=null perm=null proto=rpc
19:48:19.369 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.sbi dst=null perm=null proto=rpc
19:48:19.371 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:19.372 INFO MemoryStore - Block broadcast_40 stored as values in memory (estimated size 320.0 B, free 1918.0 MiB)
19:48:19.373 INFO MemoryStore - Block broadcast_40_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.0 MiB)
19:48:19.373 INFO BlockManagerInfo - Added broadcast_40_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.7 MiB)
19:48:19.374 INFO SparkContext - Created broadcast 40 from broadcast at BamSource.java:104
19:48:19.376 INFO MemoryStore - Block broadcast_41 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
19:48:19.389 INFO MemoryStore - Block broadcast_41_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:19.390 INFO BlockManagerInfo - Added broadcast_41_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:19.390 INFO SparkContext - Created broadcast 41 from newAPIHadoopFile at PathSplitSource.java:96
19:48:19.415 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.415 INFO FileInputFormat - Total input files to process : 1
19:48:19.416 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.420 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741826_1002 replica FinalizedReplica, blk_1073741826_1002, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage10689261495343833868/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741826 for deletion
19:48:19.421 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741826_1002 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741826
19:48:19.448 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:19.449 INFO DAGScheduler - Got job 21 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:19.449 INFO DAGScheduler - Final stage: ResultStage 31 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:19.449 INFO DAGScheduler - Parents of final stage: List()
19:48:19.449 INFO DAGScheduler - Missing parents: List()
19:48:19.449 INFO DAGScheduler - Submitting ResultStage 31 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:19.457 INFO MemoryStore - Block broadcast_42 stored as values in memory (estimated size 148.2 KiB, free 1917.5 MiB)
19:48:19.458 INFO MemoryStore - Block broadcast_42_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
19:48:19.459 INFO BlockManagerInfo - Added broadcast_42_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:19.459 INFO SparkContext - Created broadcast 42 from broadcast at DAGScheduler.scala:1580
19:48:19.459 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 31 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:19.459 INFO TaskSchedulerImpl - Adding task set 31.0 with 1 tasks resource profile 0
19:48:19.460 INFO TaskSetManager - Starting task 0.0 in stage 31.0 (TID 69) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:19.460 INFO Executor - Running task 0.0 in stage 31.0 (TID 69)
19:48:19.480 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam:0+237038
19:48:19.481 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.482 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.484 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.485 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.486 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.488 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:19.497 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:19.497 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:19.507 INFO Executor - Finished task 0.0 in stage 31.0 (TID 69). 651526 bytes result sent to driver
19:48:19.512 INFO TaskSetManager - Finished task 0.0 in stage 31.0 (TID 69) in 52 ms on localhost (executor driver) (1/1)
19:48:19.512 INFO TaskSchedulerImpl - Removed TaskSet 31.0, whose tasks have all completed, from pool
19:48:19.513 INFO DAGScheduler - ResultStage 31 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.062 s
19:48:19.513 INFO DAGScheduler - Job 21 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:19.513 INFO TaskSchedulerImpl - Killing all running tasks in stage 31: Stage finished
19:48:19.513 INFO DAGScheduler - Job 21 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.064506 s
19:48:19.541 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:19.541 INFO DAGScheduler - Got job 22 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:19.541 INFO DAGScheduler - Final stage: ResultStage 32 (count at ReadsSparkSinkUnitTest.java:185)
19:48:19.541 INFO DAGScheduler - Parents of final stage: List()
19:48:19.541 INFO DAGScheduler - Missing parents: List()
19:48:19.542 INFO DAGScheduler - Submitting ResultStage 32 (MapPartitionsRDD[70] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:19.564 INFO MemoryStore - Block broadcast_43 stored as values in memory (estimated size 426.1 KiB, free 1917.1 MiB)
19:48:19.566 INFO MemoryStore - Block broadcast_43_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
19:48:19.566 INFO BlockManagerInfo - Added broadcast_43_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.4 MiB)
19:48:19.566 INFO SparkContext - Created broadcast 43 from broadcast at DAGScheduler.scala:1580
19:48:19.567 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 32 (MapPartitionsRDD[70] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:19.567 INFO TaskSchedulerImpl - Adding task set 32.0 with 1 tasks resource profile 0
19:48:19.567 INFO TaskSetManager - Starting task 0.0 in stage 32.0 (TID 70) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:19.568 INFO Executor - Running task 0.0 in stage 32.0 (TID 70)
19:48:19.609 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:19.624 INFO Executor - Finished task 0.0 in stage 32.0 (TID 70). 989 bytes result sent to driver
19:48:19.625 INFO TaskSetManager - Finished task 0.0 in stage 32.0 (TID 70) in 58 ms on localhost (executor driver) (1/1)
19:48:19.625 INFO TaskSchedulerImpl - Removed TaskSet 32.0, whose tasks have all completed, from pool
19:48:19.625 INFO DAGScheduler - ResultStage 32 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.083 s
19:48:19.626 INFO DAGScheduler - Job 22 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:19.626 INFO TaskSchedulerImpl - Killing all running tasks in stage 32: Stage finished
19:48:19.626 INFO DAGScheduler - Job 22 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.084927 s
19:48:19.635 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:19.636 INFO DAGScheduler - Got job 23 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:19.636 INFO DAGScheduler - Final stage: ResultStage 33 (count at ReadsSparkSinkUnitTest.java:185)
19:48:19.636 INFO DAGScheduler - Parents of final stage: List()
19:48:19.636 INFO DAGScheduler - Missing parents: List()
19:48:19.636 INFO DAGScheduler - Submitting ResultStage 33 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:19.644 INFO MemoryStore - Block broadcast_44 stored as values in memory (estimated size 148.1 KiB, free 1916.8 MiB)
19:48:19.645 INFO MemoryStore - Block broadcast_44_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.7 MiB)
19:48:19.645 INFO BlockManagerInfo - Added broadcast_44_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.3 MiB)
19:48:19.645 INFO SparkContext - Created broadcast 44 from broadcast at DAGScheduler.scala:1580
19:48:19.646 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 33 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:19.646 INFO TaskSchedulerImpl - Adding task set 33.0 with 1 tasks resource profile 0
19:48:19.646 INFO TaskSetManager - Starting task 0.0 in stage 33.0 (TID 71) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:19.647 INFO Executor - Running task 0.0 in stage 33.0 (TID 71)
19:48:19.661 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam:0+237038
19:48:19.662 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.663 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam dst=null perm=null proto=rpc
19:48:19.665 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.665 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.666 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_80817d3e-85bb-4e1a-b212-5da91ff1be81.bam.bai dst=null perm=null proto=rpc
19:48:19.668 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:19.670 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:19.671 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:19.674 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:19.676 INFO Executor - Finished task 0.0 in stage 33.0 (TID 71). 989 bytes result sent to driver
19:48:19.677 INFO TaskSetManager - Finished task 0.0 in stage 33.0 (TID 71) in 31 ms on localhost (executor driver) (1/1)
19:48:19.677 INFO TaskSchedulerImpl - Removed TaskSet 33.0, whose tasks have all completed, from pool
19:48:19.677 INFO DAGScheduler - ResultStage 33 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.041 s
19:48:19.677 INFO DAGScheduler - Job 23 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:19.677 INFO TaskSchedulerImpl - Killing all running tasks in stage 33: Stage finished
19:48:19.677 INFO DAGScheduler - Job 23 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.042039 s
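[editor's note] Jobs 21-23 are the unit test reading the BAM it just wrote back out of HDFS and comparing it with the original input (collect at ReadsSparkSinkUnitTest.java:182, count at line 185). A hypothetical helper capturing the shape of that round-trip check is sketched below; the two RDDs would come from the test's reads source, which is not reproduced here, and this is not GATK's own code.

    import org.apache.spark.api.java.JavaRDD;
    import org.broadinstitute.hellbender.utils.read.GATKRead;
    import org.testng.Assert;

    public final class RoundTripCheck {
        private RoundTripCheck() {}

        // Compares the reads written to HDFS with the reads originally loaded.
        // Both RDDs are assumed to be supplied by the caller.
        public static void assertSameSize(JavaRDD<GATKRead> original, JavaRDD<GATKRead> readBack) {
            Assert.assertEquals(readBack.count(), original.count(),
                    "read-back BAM should contain the same number of reads");
        }
    }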
19:48:19.684 INFO MemoryStore - Block broadcast_45 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
19:48:19.697 INFO MemoryStore - Block broadcast_45_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
19:48:19.697 INFO BlockManagerInfo - Added broadcast_45_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:19.697 INFO SparkContext - Created broadcast 45 from newAPIHadoopFile at PathSplitSource.java:96
19:48:19.734 INFO MemoryStore - Block broadcast_46 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
19:48:19.746 INFO MemoryStore - Block broadcast_46_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
19:48:19.746 INFO BlockManagerInfo - Added broadcast_46_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:19.746 INFO SparkContext - Created broadcast 46 from newAPIHadoopFile at PathSplitSource.java:96
19:48:19.775 INFO FileInputFormat - Total input files to process : 1
19:48:19.782 INFO MemoryStore - Block broadcast_47 stored as values in memory (estimated size 160.7 KiB, free 1915.9 MiB)
19:48:19.801 INFO MemoryStore - Block broadcast_47_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.9 MiB)
19:48:19.801 INFO BlockManagerInfo - Added broadcast_47_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.2 MiB)
19:48:19.802 INFO SparkContext - Created broadcast 47 from broadcast at ReadsSparkSink.java:133
19:48:19.802 INFO BlockManagerInfo - Removed broadcast_37_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:48:19.804 INFO BlockManagerInfo - Removed broadcast_41_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:19.807 INFO MemoryStore - Block broadcast_48 stored as values in memory (estimated size 163.2 KiB, free 1916.2 MiB)
19:48:19.808 INFO BlockManagerInfo - Removed broadcast_40_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.3 MiB)
19:48:19.810 INFO BlockManagerInfo - Removed broadcast_34_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:19.811 INFO BlockManagerInfo - Removed broadcast_39_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.4 MiB)
19:48:19.812 INFO MemoryStore - Block broadcast_48_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
19:48:19.812 INFO BlockManagerInfo - Added broadcast_48_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:48:19.812 INFO SparkContext - Created broadcast 48 from broadcast at BamSink.java:76
19:48:19.815 INFO BlockManagerInfo - Removed broadcast_46_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:19.816 INFO BlockManagerInfo - Removed broadcast_36_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.5 MiB)
19:48:19.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts dst=null perm=null proto=rpc
19:48:19.818 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:19.818 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:19.818 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
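[editor's note] "File Output Committer Algorithm version is 1" means each task writes under _temporary and its output is promoted by renames at task and job commit, which is exactly the mkdirs/rename/delete traffic in the audit lines below. The setting behind this message is the standard Hadoop property mapreduce.fileoutputcommitter.algorithm.version; a minimal sketch of reading and setting it on a job Configuration, under that assumption:

    import org.apache.hadoop.conf.Configuration;

    public class CommitterVersion {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // Version 1 (logged above): task output goes to a per-job _temporary
            // area and is renamed again at job commit.
            // Version 2 renames task output straight to the destination at task
            // commit, trading some job-level atomicity for fewer commit renames.
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            System.out.println(conf.getInt("mapreduce.fileoutputcommitter.algorithm.version", 1));
        }
    }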
19:48:19.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:19.821 INFO BlockManagerInfo - Removed broadcast_38_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.6 MiB)
19:48:19.823 INFO BlockManagerInfo - Removed broadcast_43_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.8 MiB)
19:48:19.826 INFO BlockManagerInfo - Removed broadcast_35_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:19.827 INFO BlockManagerInfo - Removed broadcast_42_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.9 MiB)
19:48:19.828 INFO BlockManagerInfo - Removed broadcast_44_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.9 MiB)
19:48:19.832 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:19.833 INFO DAGScheduler - Registering RDD 102 (mapToPair at SparkUtils.java:161) as input to shuffle 8
19:48:19.833 INFO DAGScheduler - Got job 24 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:19.833 INFO DAGScheduler - Final stage: ResultStage 35 (runJob at SparkHadoopWriter.scala:83)
19:48:19.833 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 34)
19:48:19.833 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 34)
19:48:19.833 INFO DAGScheduler - Submitting ShuffleMapStage 34 (MapPartitionsRDD[102] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:19.872 INFO MemoryStore - Block broadcast_49 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
19:48:19.874 INFO MemoryStore - Block broadcast_49_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
19:48:19.874 INFO BlockManagerInfo - Added broadcast_49_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.8 MiB)
19:48:19.874 INFO SparkContext - Created broadcast 49 from broadcast at DAGScheduler.scala:1580
19:48:19.875 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 34 (MapPartitionsRDD[102] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:19.875 INFO TaskSchedulerImpl - Adding task set 34.0 with 1 tasks resource profile 0
19:48:19.876 INFO TaskSetManager - Starting task 0.0 in stage 34.0 (TID 72) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:19.876 INFO Executor - Running task 0.0 in stage 34.0 (TID 72)
19:48:19.941 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:19.977 INFO Executor - Finished task 0.0 in stage 34.0 (TID 72). 1148 bytes result sent to driver
19:48:19.977 INFO TaskSetManager - Finished task 0.0 in stage 34.0 (TID 72) in 101 ms on localhost (executor driver) (1/1)
19:48:19.977 INFO TaskSchedulerImpl - Removed TaskSet 34.0, whose tasks have all completed, from pool
19:48:19.978 INFO DAGScheduler - ShuffleMapStage 34 (mapToPair at SparkUtils.java:161) finished in 0.144 s
19:48:19.978 INFO DAGScheduler - looking for newly runnable stages
19:48:19.978 INFO DAGScheduler - running: HashSet()
19:48:19.978 INFO DAGScheduler - waiting: HashSet(ResultStage 35)
19:48:19.978 INFO DAGScheduler - failed: HashSet()
19:48:19.978 INFO DAGScheduler - Submitting ResultStage 35 (MapPartitionsRDD[107] at mapToPair at BamSink.java:91), which has no missing parents
19:48:19.989 INFO MemoryStore - Block broadcast_50 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
19:48:19.990 INFO MemoryStore - Block broadcast_50_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
19:48:19.990 INFO BlockManagerInfo - Added broadcast_50_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.7 MiB)
19:48:19.991 INFO SparkContext - Created broadcast 50 from broadcast at DAGScheduler.scala:1580
19:48:19.991 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 35 (MapPartitionsRDD[107] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:19.991 INFO TaskSchedulerImpl - Adding task set 35.0 with 1 tasks resource profile 0
19:48:19.992 INFO TaskSetManager - Starting task 0.0 in stage 35.0 (TID 73) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:19.992 INFO Executor - Running task 0.0 in stage 35.0 (TID 73)
19:48:20.000 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:20.000 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:20.027 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:20.027 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:20.027 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:20.028 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:20.028 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:20.028 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:20.029 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:20.031 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:20.032 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:20.036 INFO StateChange - BLOCK* allocate blk_1073741832_1008, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/part-r-00000
19:48:20.038 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741832_1008 src: /127.0.0.1:45230 dest: /127.0.0.1:45925
19:48:20.043 INFO clienttrace - src: /127.0.0.1:45230, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741832_1008, duration(ns): 4062562
19:48:20.043 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741832_1008, type=LAST_IN_PIPELINE terminating
19:48:20.045 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:20.046 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:20.047 INFO StateChange - BLOCK* allocate blk_1073741833_1009, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/.part-r-00000.sbi
19:48:20.048 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741833_1009 src: /127.0.0.1:45240 dest: /127.0.0.1:45925
19:48:20.049 INFO clienttrace - src: /127.0.0.1:45240, dest: /127.0.0.1:45925, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741833_1009, duration(ns): 279855
19:48:20.049 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741833_1009, type=LAST_IN_PIPELINE terminating
19:48:20.050 INFO FSNamesystem - BLOCK* blk_1073741833_1009 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/.part-r-00000.sbi
19:48:20.451 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:20.454 INFO StateChange - BLOCK* allocate blk_1073741834_1010, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/.part-r-00000.bai
19:48:20.456 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741834_1010 src: /127.0.0.1:45248 dest: /127.0.0.1:45925
19:48:20.457 INFO clienttrace - src: /127.0.0.1:45248, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741834_1010, duration(ns): 678453
19:48:20.457 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741834_1010, type=LAST_IN_PIPELINE terminating
19:48:20.458 INFO FSNamesystem - BLOCK* blk_1073741834_1010 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/.part-r-00000.bai
19:48:20.859 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:20.860 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0 dst=null perm=null proto=rpc
19:48:20.861 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0 dst=null perm=null proto=rpc
19:48:20.862 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/task_202507151948196496216050568958517_0107_r_000000 dst=null perm=null proto=rpc
19:48:20.863 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/_temporary/attempt_202507151948196496216050568958517_0107_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/task_202507151948196496216050568958517_0107_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:20.864 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948196496216050568958517_0107_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/task_202507151948196496216050568958517_0107_r_000000
19:48:20.864 INFO SparkHadoopMapRedUtil - attempt_202507151948196496216050568958517_0107_r_000000_0: Committed. Elapsed time: 3 ms.
19:48:20.865 INFO Executor - Finished task 0.0 in stage 35.0 (TID 73). 1858 bytes result sent to driver
19:48:20.866 INFO TaskSetManager - Finished task 0.0 in stage 35.0 (TID 73) in 874 ms on localhost (executor driver) (1/1)
19:48:20.866 INFO TaskSchedulerImpl - Removed TaskSet 35.0, whose tasks have all completed, from pool
19:48:20.867 INFO DAGScheduler - ResultStage 35 (runJob at SparkHadoopWriter.scala:83) finished in 0.889 s
19:48:20.867 INFO DAGScheduler - Job 24 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:20.867 INFO TaskSchedulerImpl - Killing all running tasks in stage 35: Stage finished
19:48:20.867 INFO DAGScheduler - Job 24 finished: runJob at SparkHadoopWriter.scala:83, took 1.034915 s
19:48:20.868 INFO SparkHadoopWriter - Start to commit write Job job_202507151948196496216050568958517_0107.
19:48:20.869 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:20.870 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts dst=null perm=null proto=rpc
19:48:20.870 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/task_202507151948196496216050568958517_0107_r_000000 dst=null perm=null proto=rpc
19:48:20.871 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:20.872 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/task_202507151948196496216050568958517_0107_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:20.873 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:20.873 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/task_202507151948196496216050568958517_0107_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:20.874 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:20.875 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary/0/task_202507151948196496216050568958517_0107_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:20.876 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:20.876 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:20.877 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:20.878 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.spark-staging-107 dst=null perm=null proto=rpc
19:48:20.878 INFO SparkHadoopWriter - Write Job job_202507151948196496216050568958517_0107 committed. Elapsed time: 10 ms.
19:48:20.879 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:20.881 INFO StateChange - BLOCK* allocate blk_1073741835_1011, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/header
19:48:20.882 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741835_1011 src: /127.0.0.1:45260 dest: /127.0.0.1:45925
19:48:20.884 INFO clienttrace - src: /127.0.0.1:45260, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741835_1011, duration(ns): 552866
19:48:20.884 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741835_1011, type=LAST_IN_PIPELINE terminating
19:48:20.885 INFO FSNamesystem - BLOCK* blk_1073741835_1011 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/header
19:48:21.286 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:21.287 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:21.288 INFO StateChange - BLOCK* allocate blk_1073741836_1012, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/terminator
19:48:21.289 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741836_1012 src: /127.0.0.1:45270 dest: /127.0.0.1:45925
19:48:21.290 INFO clienttrace - src: /127.0.0.1:45270, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741836_1012, duration(ns): 498755
19:48:21.291 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741836_1012, type=LAST_IN_PIPELINE terminating
19:48:21.292 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:21.292 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts dst=null perm=null proto=rpc
19:48:21.294 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:21.295 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:21.295 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam
19:48:21.296 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:21.297 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:21.298 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:21.298 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam done
19:48:21.298 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:21.298 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi
19:48:21.299 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts dst=null perm=null proto=rpc
19:48:21.300 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:21.301 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:21.302 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:21.303 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
19:48:21.304 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:21.305 INFO StateChange - BLOCK* allocate blk_1073741837_1013, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi
19:48:21.306 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741837_1013 src: /127.0.0.1:45278 dest: /127.0.0.1:45925
19:48:21.307 INFO clienttrace - src: /127.0.0.1:45278, dest: /127.0.0.1:45925, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741837_1013, duration(ns): 583810
19:48:21.307 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741837_1013, type=LAST_IN_PIPELINE terminating
19:48:21.308 INFO FSNamesystem - BLOCK* blk_1073741837_1013 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi
19:48:21.709 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:21.710 INFO IndexFileMerger - Done merging .sbi files
19:48:21.710 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai
19:48:21.710 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts dst=null perm=null proto=rpc
19:48:21.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:21.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:21.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:21.715 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:21.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:21.722 INFO StateChange - BLOCK* allocate blk_1073741838_1014, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai
19:48:21.723 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741838_1014 src: /127.0.0.1:45282 dest: /127.0.0.1:45925
19:48:21.724 INFO clienttrace - src: /127.0.0.1:45282, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741838_1014, duration(ns): 712742
19:48:21.724 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741838_1014, type=LAST_IN_PIPELINE terminating
19:48:21.725 INFO FSNamesystem - BLOCK* blk_1073741838_1014 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai
19:48:22.126 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:22.127 INFO IndexFileMerger - Done merging .bai files
19:48:22.128 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.parts dst=null perm=null proto=rpc
19:48:22.137 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.149 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi dst=null perm=null proto=rpc
19:48:22.150 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi dst=null perm=null proto=rpc
19:48:22.150 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi dst=null perm=null proto=rpc
19:48:22.152 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.153 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.154 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.155 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.156 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.157 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.158 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.159 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:22.162 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:22.163 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:22.163 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:22.164 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi dst=null perm=null proto=rpc
19:48:22.164 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi dst=null perm=null proto=rpc
19:48:22.165 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.sbi dst=null perm=null proto=rpc
19:48:22.166 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
19:48:22.167 INFO MemoryStore - Block broadcast_51 stored as values in memory (estimated size 13.3 KiB, free 1918.3 MiB)
19:48:22.168 INFO MemoryStore - Block broadcast_51_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.3 MiB)
19:48:22.168 INFO BlockManagerInfo - Added broadcast_51_piece0 in memory on localhost:36125 (size: 8.3 KiB, free: 1919.7 MiB)
19:48:22.168 INFO SparkContext - Created broadcast 51 from broadcast at BamSource.java:104
19:48:22.170 INFO MemoryStore - Block broadcast_52 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
19:48:22.183 INFO MemoryStore - Block broadcast_52_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:22.183 INFO BlockManagerInfo - Added broadcast_52_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:22.183 INFO SparkContext - Created broadcast 52 from newAPIHadoopFile at PathSplitSource.java:96
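Broadcast 52 above is created when the test builds an RDD via SparkContext.newAPIHadoopFile (PathSplitSource.java:96). A hedged sketch of that call in Java; the real code path uses a BAM-specific input format from the disq library, so TextInputFormat stands in here only to keep the example self-contained:

```java
// Hedged sketch of the newAPIHadoopFile call reported above. TextInputFormat and the
// input path are stand-ins, not the BAM input format used at PathSplitSource.java:96.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class NewApiHadoopFileSketch {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[*]", "newAPIHadoopFile-sketch");
        Configuration conf = new Configuration();
        // Assumes the input file exists; the test points this at the .bam it just wrote.
        JavaPairRDD<LongWritable, Text> rdd = sc.newAPIHadoopFile(
                "hdfs://localhost:41235/user/runner/example.txt",
                TextInputFormat.class, LongWritable.class, Text.class, conf);
        System.out.println("partitions=" + rdd.getNumPartitions());
        sc.stop();
    }
}
```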
19:48:22.201 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.202 INFO FileInputFormat - Total input files to process : 1
19:48:22.202 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.222 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:22.223 INFO DAGScheduler - Got job 25 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:22.223 INFO DAGScheduler - Final stage: ResultStage 36 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:22.223 INFO DAGScheduler - Parents of final stage: List()
19:48:22.223 INFO DAGScheduler - Missing parents: List()
19:48:22.223 INFO DAGScheduler - Submitting ResultStage 36 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:22.235 INFO MemoryStore - Block broadcast_53 stored as values in memory (estimated size 148.2 KiB, free 1917.8 MiB)
19:48:22.236 INFO MemoryStore - Block broadcast_53_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
19:48:22.236 INFO BlockManagerInfo - Added broadcast_53_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:22.237 INFO SparkContext - Created broadcast 53 from broadcast at DAGScheduler.scala:1580
19:48:22.237 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 36 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:22.237 INFO TaskSchedulerImpl - Adding task set 36.0 with 1 tasks resource profile 0
19:48:22.238 INFO TaskSetManager - Starting task 0.0 in stage 36.0 (TID 74) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:22.238 INFO Executor - Running task 0.0 in stage 36.0 (TID 74)
19:48:22.252 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam:0+237038
19:48:22.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.254 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.256 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.257 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.257 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.259 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:22.262 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:22.263 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:22.266 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:22.266 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:22.272 INFO Executor - Finished task 0.0 in stage 36.0 (TID 74). 651526 bytes result sent to driver
19:48:22.277 INFO TaskSetManager - Finished task 0.0 in stage 36.0 (TID 74) in 38 ms on localhost (executor driver) (1/1)
19:48:22.277 INFO TaskSchedulerImpl - Removed TaskSet 36.0, whose tasks have all completed, from pool
19:48:22.277 INFO DAGScheduler - ResultStage 36 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.054 s
19:48:22.277 INFO DAGScheduler - Job 25 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:22.277 INFO TaskSchedulerImpl - Killing all running tasks in stage 36: Stage finished
19:48:22.277 INFO DAGScheduler - Job 25 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.055070 s
19:48:22.296 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:22.296 INFO DAGScheduler - Got job 26 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:22.296 INFO DAGScheduler - Final stage: ResultStage 37 (count at ReadsSparkSinkUnitTest.java:185)
19:48:22.296 INFO DAGScheduler - Parents of final stage: List()
19:48:22.296 INFO DAGScheduler - Missing parents: List()
19:48:22.296 INFO DAGScheduler - Submitting ResultStage 37 (MapPartitionsRDD[95] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:22.321 INFO MemoryStore - Block broadcast_54 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:48:22.323 INFO MemoryStore - Block broadcast_54_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
19:48:22.323 INFO BlockManagerInfo - Added broadcast_54_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.4 MiB)
19:48:22.323 INFO SparkContext - Created broadcast 54 from broadcast at DAGScheduler.scala:1580
19:48:22.324 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 37 (MapPartitionsRDD[95] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:22.324 INFO TaskSchedulerImpl - Adding task set 37.0 with 1 tasks resource profile 0
19:48:22.325 INFO TaskSetManager - Starting task 0.0 in stage 37.0 (TID 75) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:22.325 INFO Executor - Running task 0.0 in stage 37.0 (TID 75)
19:48:22.360 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:22.376 INFO Executor - Finished task 0.0 in stage 37.0 (TID 75). 989 bytes result sent to driver
19:48:22.377 INFO TaskSetManager - Finished task 0.0 in stage 37.0 (TID 75) in 53 ms on localhost (executor driver) (1/1)
19:48:22.377 INFO TaskSchedulerImpl - Removed TaskSet 37.0, whose tasks have all completed, from pool
19:48:22.377 INFO DAGScheduler - ResultStage 37 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.080 s
19:48:22.377 INFO DAGScheduler - Job 26 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:22.377 INFO TaskSchedulerImpl - Killing all running tasks in stage 37: Stage finished
19:48:22.377 INFO DAGScheduler - Job 26 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.081787 s
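Jobs 26 and 27 above follow the same filter-then-count pattern (filter at ReadsSparkSource.java:96, count at ReadsSparkSinkUnitTest.java:185), each action triggering one Spark job. A minimal stand-in showing that pattern; the predicate is illustrative, not the test's read filter:

```java
// Minimal sketch of a filter + count action, which launches a single Spark job
// like jobs 26/27 above. The integer data and predicate are stand-ins.
import java.util.Arrays;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class FilterCountSketch {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[*]", "filter-count-sketch");
        JavaRDD<Integer> reads = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5));
        long kept = reads.filter(x -> x % 2 == 0).count();  // count() triggers the job
        System.out.println("kept=" + kept);
        sc.stop();
    }
}
```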
19:48:22.384 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:22.384 INFO DAGScheduler - Got job 27 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:22.384 INFO DAGScheduler - Final stage: ResultStage 38 (count at ReadsSparkSinkUnitTest.java:185)
19:48:22.384 INFO DAGScheduler - Parents of final stage: List()
19:48:22.385 INFO DAGScheduler - Missing parents: List()
19:48:22.385 INFO DAGScheduler - Submitting ResultStage 38 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:22.395 INFO MemoryStore - Block broadcast_55 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
19:48:22.396 INFO MemoryStore - Block broadcast_55_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.0 MiB)
19:48:22.397 INFO BlockManagerInfo - Added broadcast_55_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.4 MiB)
19:48:22.397 INFO SparkContext - Created broadcast 55 from broadcast at DAGScheduler.scala:1580
19:48:22.397 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 38 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:22.397 INFO TaskSchedulerImpl - Adding task set 38.0 with 1 tasks resource profile 0
19:48:22.398 INFO TaskSetManager - Starting task 0.0 in stage 38.0 (TID 76) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:22.398 INFO Executor - Running task 0.0 in stage 38.0 (TID 76)
19:48:22.415 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741827_1003 replica FinalizedReplica, blk_1073741827_1003, FINALIZED (numBytes=5472, bytesOnDisk=5472, visibleLength=5472, volume=/tmp/minicluster_storage10689261495343833868/data/data1, blockURI=file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741827) for deletion
19:48:22.415 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741833_1009 replica FinalizedReplica, blk_1073741833_1009, FINALIZED (numBytes=13492, bytesOnDisk=13492, visibleLength=13492, volume=/tmp/minicluster_storage10689261495343833868/data/data1, blockURI=file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741833) for deletion
19:48:22.415 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741834_1010 replica FinalizedReplica, blk_1073741834_1010, FINALIZED (numBytes=5472, bytesOnDisk=5472, visibleLength=5472, volume=/tmp/minicluster_storage10689261495343833868/data/data2, blockURI=file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741834) for deletion
19:48:22.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741834_1010 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741834
19:48:22.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741827_1003 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741827
19:48:22.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741833_1009 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741833
19:48:22.418 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam:0+237038
19:48:22.418 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.419 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam dst=null perm=null proto=rpc
19:48:22.420 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.421 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.422 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c089c8ad-ad53-46bf-a2ed-30b350c851e2.bam.bai dst=null perm=null proto=rpc
19:48:22.424 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:22.426 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:22.427 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:22.429 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:22.429 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:22.432 INFO Executor - Finished task 0.0 in stage 38.0 (TID 76). 989 bytes result sent to driver
19:48:22.432 INFO TaskSetManager - Finished task 0.0 in stage 38.0 (TID 76) in 34 ms on localhost (executor driver) (1/1)
19:48:22.432 INFO TaskSchedulerImpl - Removed TaskSet 38.0, whose tasks have all completed, from pool
19:48:22.433 INFO DAGScheduler - ResultStage 38 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.047 s
19:48:22.433 INFO DAGScheduler - Job 27 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:22.433 INFO TaskSchedulerImpl - Killing all running tasks in stage 38: Stage finished
19:48:22.433 INFO DAGScheduler - Job 27 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.048970 s
19:48:22.438 INFO MemoryStore - Block broadcast_56 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
19:48:22.449 INFO MemoryStore - Block broadcast_56_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
19:48:22.449 INFO BlockManagerInfo - Added broadcast_56_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:22.449 INFO SparkContext - Created broadcast 56 from newAPIHadoopFile at PathSplitSource.java:96
19:48:22.482 INFO MemoryStore - Block broadcast_57 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
19:48:22.489 INFO MemoryStore - Block broadcast_57_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
19:48:22.489 INFO BlockManagerInfo - Added broadcast_57_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:22.489 INFO SparkContext - Created broadcast 57 from newAPIHadoopFile at PathSplitSource.java:96
19:48:22.514 INFO FileInputFormat - Total input files to process : 1
19:48:22.517 INFO MemoryStore - Block broadcast_58 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
19:48:22.518 INFO MemoryStore - Block broadcast_58_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
19:48:22.519 INFO BlockManagerInfo - Added broadcast_58_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:22.519 INFO SparkContext - Created broadcast 58 from broadcast at ReadsSparkSink.java:133
19:48:22.520 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:22.521 INFO MemoryStore - Block broadcast_59 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
19:48:22.522 INFO MemoryStore - Block broadcast_59_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
19:48:22.522 INFO BlockManagerInfo - Added broadcast_59_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:22.523 INFO SparkContext - Created broadcast 59 from broadcast at BamSink.java:76
19:48:22.527 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts dst=null perm=null proto=rpc
19:48:22.527 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:22.527 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:22.527 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
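The committer lines above report the FileOutputCommitter v1 algorithm with _temporary cleanup enabled and cleanup failures not ignored. A hedged sketch of the corresponding Hadoop configuration keys; the property names are the standard FileOutputCommitter keys but should be treated as assumptions to verify against the Hadoop version in use:

```java
// Hedged sketch of the committer settings the log reports as in effect.
// Key names are assumptions based on standard FileOutputCommitter constants.
import org.apache.hadoop.conf.Configuration;

public class CommitterConfigSketch {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);          // "Algorithm version is 1"
        conf.setBoolean("mapreduce.fileoutputcommitter.cleanup.skipped", false);     // "skip cleanup ... :false"
        conf.setBoolean("mapreduce.fileoutputcommitter.cleanup-failures.ignored", false); // "ignore cleanup failures: false"
        System.out.println(conf.get("mapreduce.fileoutputcommitter.algorithm.version"));
    }
}
```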
19:48:22.529 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:22.539 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:22.540 INFO DAGScheduler - Registering RDD 127 (mapToPair at SparkUtils.java:161) as input to shuffle 9
19:48:22.540 INFO DAGScheduler - Got job 28 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:22.540 INFO DAGScheduler - Final stage: ResultStage 40 (runJob at SparkHadoopWriter.scala:83)
19:48:22.540 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 39)
19:48:22.541 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 39)
19:48:22.541 INFO DAGScheduler - Submitting ShuffleMapStage 39 (MapPartitionsRDD[127] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:22.567 INFO MemoryStore - Block broadcast_60 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
19:48:22.569 INFO MemoryStore - Block broadcast_60_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.3 MiB)
19:48:22.569 INFO BlockManagerInfo - Added broadcast_60_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.1 MiB)
19:48:22.569 INFO SparkContext - Created broadcast 60 from broadcast at DAGScheduler.scala:1580
19:48:22.570 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 39 (MapPartitionsRDD[127] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:22.570 INFO TaskSchedulerImpl - Adding task set 39.0 with 1 tasks resource profile 0
19:48:22.570 INFO TaskSetManager - Starting task 0.0 in stage 39.0 (TID 77) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:22.571 INFO Executor - Running task 0.0 in stage 39.0 (TID 77)
19:48:22.608 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:22.629 INFO Executor - Finished task 0.0 in stage 39.0 (TID 77). 1148 bytes result sent to driver
19:48:22.630 INFO TaskSetManager - Finished task 0.0 in stage 39.0 (TID 77) in 60 ms on localhost (executor driver) (1/1)
19:48:22.630 INFO TaskSchedulerImpl - Removed TaskSet 39.0, whose tasks have all completed, from pool
19:48:22.630 INFO DAGScheduler - ShuffleMapStage 39 (mapToPair at SparkUtils.java:161) finished in 0.089 s
19:48:22.631 INFO DAGScheduler - looking for newly runnable stages
19:48:22.631 INFO DAGScheduler - running: HashSet()
19:48:22.631 INFO DAGScheduler - waiting: HashSet(ResultStage 40)
19:48:22.631 INFO DAGScheduler - failed: HashSet()
19:48:22.631 INFO DAGScheduler - Submitting ResultStage 40 (MapPartitionsRDD[132] at mapToPair at BamSink.java:91), which has no missing parents
19:48:22.638 INFO MemoryStore - Block broadcast_61 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
19:48:22.654 INFO MemoryStore - Block broadcast_61_piece0 stored as bytes in memory (estimated size 67.2 KiB, free 1915.0 MiB)
19:48:22.655 INFO BlockManagerInfo - Added broadcast_61_piece0 in memory on localhost:36125 (size: 67.2 KiB, free: 1919.0 MiB)
19:48:22.655 INFO BlockManagerInfo - Removed broadcast_49_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.2 MiB)
19:48:22.655 INFO SparkContext - Created broadcast 61 from broadcast at DAGScheduler.scala:1580
19:48:22.655 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 40 (MapPartitionsRDD[132] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:22.655 INFO TaskSchedulerImpl - Adding task set 40.0 with 1 tasks resource profile 0
19:48:22.656 INFO BlockManagerInfo - Removed broadcast_52_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:22.656 INFO TaskSetManager - Starting task 0.0 in stage 40.0 (TID 78) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:22.657 INFO Executor - Running task 0.0 in stage 40.0 (TID 78)
19:48:22.657 INFO BlockManagerInfo - Removed broadcast_55_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.3 MiB)
19:48:22.658 INFO BlockManagerInfo - Removed broadcast_54_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:22.659 INFO BlockManagerInfo - Removed broadcast_51_piece0 on localhost:36125 in memory (size: 8.3 KiB, free: 1919.5 MiB)
19:48:22.660 INFO BlockManagerInfo - Removed broadcast_47_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.5 MiB)
19:48:22.661 INFO BlockManagerInfo - Removed broadcast_53_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.5 MiB)
19:48:22.661 INFO BlockManagerInfo - Removed broadcast_50_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.6 MiB)
19:48:22.662 INFO BlockManagerInfo - Removed broadcast_48_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:22.663 INFO BlockManagerInfo - Removed broadcast_45_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:22.664 INFO BlockManagerInfo - Removed broadcast_57_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:22.667 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:22.667 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:22.686 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:22.686 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:22.686 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:22.686 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:22.686 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:22.686 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:22.688 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:22.689 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:22.693 INFO StateChange - BLOCK* allocate blk_1073741839_1015, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0/part-r-00000
19:48:22.694 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741839_1015 src: /127.0.0.1:45310 dest: /127.0.0.1:45925
19:48:22.699 INFO clienttrace - src: /127.0.0.1:45310, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741839_1015, duration(ns): 3465633
19:48:22.699 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741839_1015, type=LAST_IN_PIPELINE terminating
19:48:22.700 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:22.701 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:22.703 INFO StateChange - BLOCK* allocate blk_1073741840_1016, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0/.part-r-00000.bai
19:48:22.704 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741840_1016 src: /127.0.0.1:45314 dest: /127.0.0.1:45925
19:48:22.706 INFO clienttrace - src: /127.0.0.1:45314, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741840_1016, duration(ns): 505227
19:48:22.706 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741840_1016, type=LAST_IN_PIPELINE terminating
19:48:22.706 INFO FSNamesystem - BLOCK* blk_1073741840_1016 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0/.part-r-00000.bai
19:48:23.108 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:23.109 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0 dst=null perm=null proto=rpc
19:48:23.110 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0 dst=null perm=null proto=rpc
19:48:23.111 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/task_202507151948227583517652843136654_0132_r_000000 dst=null perm=null proto=rpc
19:48:23.112 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/_temporary/attempt_202507151948227583517652843136654_0132_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/task_202507151948227583517652843136654_0132_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:23.113 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948227583517652843136654_0132_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/task_202507151948227583517652843136654_0132_r_000000
19:48:23.113 INFO SparkHadoopMapRedUtil - attempt_202507151948227583517652843136654_0132_r_000000_0: Committed. Elapsed time: 2 ms.
19:48:23.114 INFO Executor - Finished task 0.0 in stage 40.0 (TID 78). 1858 bytes result sent to driver
19:48:23.115 INFO TaskSetManager - Finished task 0.0 in stage 40.0 (TID 78) in 459 ms on localhost (executor driver) (1/1)
19:48:23.115 INFO TaskSchedulerImpl - Removed TaskSet 40.0, whose tasks have all completed, from pool
19:48:23.115 INFO DAGScheduler - ResultStage 40 (runJob at SparkHadoopWriter.scala:83) finished in 0.484 s
19:48:23.116 INFO DAGScheduler - Job 28 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:23.116 INFO TaskSchedulerImpl - Killing all running tasks in stage 40: Stage finished
19:48:23.116 INFO DAGScheduler - Job 28 finished: runJob at SparkHadoopWriter.scala:83, took 0.576326 s
19:48:23.117 INFO SparkHadoopWriter - Start to commit write Job job_202507151948227583517652843136654_0132.
19:48:23.117 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:23.118 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts dst=null perm=null proto=rpc
19:48:23.119 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/task_202507151948227583517652843136654_0132_r_000000 dst=null perm=null proto=rpc
19:48:23.119 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:23.120 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/task_202507151948227583517652843136654_0132_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.121 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:23.121 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary/0/task_202507151948227583517652843136654_0132_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.122 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:23.123 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.124 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:23.125 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/.spark-staging-132 dst=null perm=null proto=rpc
19:48:23.125 INFO SparkHadoopWriter - Write Job job_202507151948227583517652843136654_0132 committed. Elapsed time: 8 ms.
19:48:23.126 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.128 INFO StateChange - BLOCK* allocate blk_1073741841_1017, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/header
19:48:23.129 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741841_1017 src: /127.0.0.1:45316 dest: /127.0.0.1:45925
19:48:23.131 INFO clienttrace - src: /127.0.0.1:45316, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741841_1017, duration(ns): 697192
19:48:23.131 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741841_1017, type=LAST_IN_PIPELINE terminating
19:48:23.132 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:23.133 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.134 INFO StateChange - BLOCK* allocate blk_1073741842_1018, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/terminator
19:48:23.135 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741842_1018 src: /127.0.0.1:45332 dest: /127.0.0.1:45925
19:48:23.136 INFO clienttrace - src: /127.0.0.1:45332, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741842_1018, duration(ns): 538859
19:48:23.136 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741842_1018, type=LAST_IN_PIPELINE terminating
19:48:23.137 INFO FSNamesystem - BLOCK* blk_1073741842_1018 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/terminator
19:48:23.538 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:23.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts dst=null perm=null proto=rpc
19:48:23.541 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.541 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:23.542 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam
19:48:23.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.543 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:23.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.544 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam done
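The concat-then-rename sequence above (audit cmd=concat followed by cmd=rename) assembles the header, the data part, and the BGZF terminator into the final .bam. A hedged sketch of that sequence with placeholder paths; note that HDFS imposes block-size preconditions on concat that are not shown here:

```java
// Hedged sketch mirroring the audit sequence above: create an empty "output" file,
// concat header + part-r-00000 + terminator into it, then rename it to the final .bam.
// Paths are placeholders, not the test's UUID-named files.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ConcatPartsSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path parts = new Path("hdfs://localhost:41235/user/runner/example.bam.parts");
        FileSystem fs = parts.getFileSystem(conf);

        Path target = new Path(parts, "output");
        fs.create(target).close();                     // audit cmd=create of the empty target
        fs.concat(target, new Path[] {                 // audit cmd=concat
                new Path(parts, "header"),
                new Path(parts, "part-r-00000"),
                new Path(parts, "terminator")});
        fs.rename(target,                              // audit cmd=rename to the final .bam
                new Path("hdfs://localhost:41235/user/runner/example.bam"));
    }
}
```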
19:48:23.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:23.544 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai
19:48:23.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts dst=null perm=null proto=rpc
19:48:23.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:23.547 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:23.547 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:23.549 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:23.552 INFO StateChange - BLOCK* allocate blk_1073741843_1019, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai
19:48:23.553 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741843_1019 src: /127.0.0.1:45336 dest: /127.0.0.1:45925
19:48:23.554 INFO clienttrace - src: /127.0.0.1:45336, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741843_1019, duration(ns): 560878
19:48:23.554 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741843_1019, type=LAST_IN_PIPELINE terminating
19:48:23.555 INFO FSNamesystem - BLOCK* blk_1073741843_1019 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai
19:48:23.956 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:23.957 INFO IndexFileMerger - Done merging .bai files
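With a single output shard, the .bai merge reported above appears to reduce to streaming .part-r-00000.bai into the destination index and deleting the shard, matching the create/open/delete audit lines. A simplified stand-in under that assumption; merging several shard indexes would require BAI offset adjustment that this sketch does not attempt:

```java
// Simplified stand-in for the single-shard .bai merge above. Assumes the shard index can be
// copied byte-for-byte; the real IndexFileMerger may rewrite offsets. Paths are placeholders.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class BaiCopySketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path parts = new Path("hdfs://localhost:41235/user/runner/example.bam.parts");
        FileSystem fs = parts.getFileSystem(conf);
        Path shardIndex = new Path(parts, ".part-r-00000.bai");
        Path mergedIndex = new Path("hdfs://localhost:41235/user/runner/example.bam.bai");

        try (FSDataInputStream in = fs.open(shardIndex);      // audit cmd=open
             FSDataOutputStream out = fs.create(mergedIndex)) { // audit cmd=create
            IOUtils.copyBytes(in, out, 4096, false);           // stream shard index into merged file
        }
        fs.delete(shardIndex, false);                          // audit cmd=delete of the shard index
    }
}
```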
19:48:23.957 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.parts dst=null perm=null proto=rpc
19:48:23.968 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:23.969 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:23.969 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:23.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:23.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:23.972 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:23.973 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:23.973 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:23.978 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:23.979 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:23.979 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:23.979 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.sbi dst=null perm=null proto=rpc
19:48:23.982 INFO MemoryStore - Block broadcast_62 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:23.988 INFO MemoryStore - Block broadcast_62_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:23.988 INFO BlockManagerInfo - Added broadcast_62_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:23.989 INFO SparkContext - Created broadcast 62 from newAPIHadoopFile at PathSplitSource.java:96
19:48:24.011 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.011 INFO FileInputFormat - Total input files to process : 1
19:48:24.011 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.049 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:24.049 INFO DAGScheduler - Got job 29 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:24.049 INFO DAGScheduler - Final stage: ResultStage 41 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:24.049 INFO DAGScheduler - Parents of final stage: List()
19:48:24.049 INFO DAGScheduler - Missing parents: List()
19:48:24.050 INFO DAGScheduler - Submitting ResultStage 41 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:24.067 INFO MemoryStore - Block broadcast_63 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
19:48:24.069 INFO MemoryStore - Block broadcast_63_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1917.4 MiB)
19:48:24.069 INFO BlockManagerInfo - Added broadcast_63_piece0 in memory on localhost:36125 (size: 153.7 KiB, free: 1919.5 MiB)
19:48:24.069 INFO SparkContext - Created broadcast 63 from broadcast at DAGScheduler.scala:1580
19:48:24.069 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 41 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:24.069 INFO TaskSchedulerImpl - Adding task set 41.0 with 1 tasks resource profile 0
19:48:24.070 INFO TaskSetManager - Starting task 0.0 in stage 41.0 (TID 79) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:24.071 INFO Executor - Running task 0.0 in stage 41.0 (TID 79)
19:48:24.112 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam:0+237038
19:48:24.114 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.114 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.116 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:24.117 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.117 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.118 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.119 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.119 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.121 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:24.124 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:24.124 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:24.125 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.125 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.126 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.127 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:24.132 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.133 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.134 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.135 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.138 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.139 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.140 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.141 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.142 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.143 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.144 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.146 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.147 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.148 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.149 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.150 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.151 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.152 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.153 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.155 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.156 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.156 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.157 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.158 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.159 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.160 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.161 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.162 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.163 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.164 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.165 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.166 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.167 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.168 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.170 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.171 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.172 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.173 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.173 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.174 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.175 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.176 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.177 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.178 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.179 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.181 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.182 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.183 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.184 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.184 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.185 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.186 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.187 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.188 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.189 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.190 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.191 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.192 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.193 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.194 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.196 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.196 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.198 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.198 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.201 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:24.204 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:24.204 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:24.207 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:24.207 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:24.213 INFO Executor - Finished task 0.0 in stage 41.0 (TID 79). 651526 bytes result sent to driver
19:48:24.215 INFO TaskSetManager - Finished task 0.0 in stage 41.0 (TID 79) in 145 ms on localhost (executor driver) (1/1)
19:48:24.215 INFO TaskSchedulerImpl - Removed TaskSet 41.0, whose tasks have all completed, from pool
19:48:24.216 INFO DAGScheduler - ResultStage 41 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.165 s
19:48:24.216 INFO DAGScheduler - Job 29 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:24.216 INFO TaskSchedulerImpl - Killing all running tasks in stage 41: Stage finished
19:48:24.216 INFO DAGScheduler - Job 29 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.167021 s
19:48:24.235 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:24.236 INFO DAGScheduler - Got job 30 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:24.236 INFO DAGScheduler - Final stage: ResultStage 42 (count at ReadsSparkSinkUnitTest.java:185)
19:48:24.236 INFO DAGScheduler - Parents of final stage: List()
19:48:24.236 INFO DAGScheduler - Missing parents: List()
19:48:24.236 INFO DAGScheduler - Submitting ResultStage 42 (MapPartitionsRDD[120] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:24.260 INFO MemoryStore - Block broadcast_64 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
19:48:24.261 INFO MemoryStore - Block broadcast_64_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
19:48:24.262 INFO BlockManagerInfo - Added broadcast_64_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.4 MiB)
19:48:24.262 INFO SparkContext - Created broadcast 64 from broadcast at DAGScheduler.scala:1580
19:48:24.262 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 42 (MapPartitionsRDD[120] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:24.262 INFO TaskSchedulerImpl - Adding task set 42.0 with 1 tasks resource profile 0
19:48:24.263 INFO TaskSetManager - Starting task 0.0 in stage 42.0 (TID 80) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:24.263 INFO Executor - Running task 0.0 in stage 42.0 (TID 80)
19:48:24.298 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:24.310 INFO Executor - Finished task 0.0 in stage 42.0 (TID 80). 989 bytes result sent to driver
19:48:24.311 INFO TaskSetManager - Finished task 0.0 in stage 42.0 (TID 80) in 48 ms on localhost (executor driver) (1/1)
19:48:24.311 INFO TaskSchedulerImpl - Removed TaskSet 42.0, whose tasks have all completed, from pool
19:48:24.311 INFO DAGScheduler - ResultStage 42 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.075 s
19:48:24.311 INFO DAGScheduler - Job 30 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:24.311 INFO TaskSchedulerImpl - Killing all running tasks in stage 42: Stage finished
19:48:24.311 INFO DAGScheduler - Job 30 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.075762 s
19:48:24.315 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:24.315 INFO DAGScheduler - Got job 31 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:24.315 INFO DAGScheduler - Final stage: ResultStage 43 (count at ReadsSparkSinkUnitTest.java:185)
19:48:24.315 INFO DAGScheduler - Parents of final stage: List()
19:48:24.315 INFO DAGScheduler - Missing parents: List()
19:48:24.315 INFO DAGScheduler - Submitting ResultStage 43 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:24.333 INFO MemoryStore - Block broadcast_65 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
19:48:24.334 INFO MemoryStore - Block broadcast_65_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
19:48:24.334 INFO BlockManagerInfo - Added broadcast_65_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.2 MiB)
19:48:24.335 INFO SparkContext - Created broadcast 65 from broadcast at DAGScheduler.scala:1580
19:48:24.335 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 43 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:24.335 INFO TaskSchedulerImpl - Adding task set 43.0 with 1 tasks resource profile 0
19:48:24.336 INFO TaskSetManager - Starting task 0.0 in stage 43.0 (TID 81) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:24.336 INFO Executor - Running task 0.0 in stage 43.0 (TID 81)
19:48:24.369 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam:0+237038
19:48:24.370 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.370 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.372 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:24.372 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.373 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.374 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.377 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:24.379 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:24.380 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:24.380 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.383 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:24.388 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
[... 54 similar WARN DFSUtil lines (bytes=233106 or bytes=1632, duration=0) from 19:48:24.389 through 19:48:24.444 elided ...]
19:48:24.445 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:24.447 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.447 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam dst=null perm=null proto=rpc
19:48:24.449 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.449 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.450 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7a26c186-7687-4563-b798-9d35dd7b3f05.bam.bai dst=null perm=null proto=rpc
19:48:24.452 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:24.454 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:24.455 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:24.457 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:24.459 INFO Executor - Finished task 0.0 in stage 43.0 (TID 81). 989 bytes result sent to driver
19:48:24.460 INFO TaskSetManager - Finished task 0.0 in stage 43.0 (TID 81) in 124 ms on localhost (executor driver) (1/1)
19:48:24.460 INFO TaskSchedulerImpl - Removed TaskSet 43.0, whose tasks have all completed, from pool
19:48:24.460 INFO DAGScheduler - ResultStage 43 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.144 s
19:48:24.460 INFO DAGScheduler - Job 31 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:24.460 INFO TaskSchedulerImpl - Killing all running tasks in stage 43: Stage finished
19:48:24.460 INFO DAGScheduler - Job 31 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.145510 s
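
Jobs 29 through 31 above are the verification half of the test: the collect at ReadsSparkSinkUnitTest.java:182 pulls the written reads back to the driver, and the two count jobs at line 185 count the original local BAM and the copy just written to the MiniDFSCluster so the totals can be compared. The sketch below approximates that check under stated assumptions: it uses Hadoop-BAM's BAMInputFormat as the reader and made-up paths, whereas the test itself goes through GATK's ReadsSparkSource, whose exact API is not visible in this log.

    // Hedged sketch of the round-trip count check; not the test's actual code.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.seqdoop.hadoop_bam.BAMInputFormat;
    import org.seqdoop.hadoop_bam.SAMRecordWritable;

    public class RoundTripCountCheck {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext("local[1]", "round-trip-count");
            Configuration conf = sc.hadoopConfiguration();

            // Hypothetical paths standing in for the local test BAM and the HDFS copy in the log.
            String localBam = "file:///path/to/HiSeq.1mb.1RG.2k_lines.bam";
            String hdfsBam  = "hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1.bam";

            JavaPairRDD<LongWritable, SAMRecordWritable> original = sc.newAPIHadoopFile(
                    localBam, BAMInputFormat.class, LongWritable.class, SAMRecordWritable.class, conf);
            JavaPairRDD<LongWritable, SAMRecordWritable> roundTripped = sc.newAPIHadoopFile(
                    hdfsBam, BAMInputFormat.class, LongWritable.class, SAMRecordWritable.class, conf);

            // The two count() actions correspond to jobs 30 and 31 in the log.
            long before = original.count();
            long after  = roundTripped.count();
            if (before != after) {
                throw new AssertionError("read count changed across round trip: " + before + " vs " + after);
            }
            sc.stop();
        }
    }
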
19:48:24.465 INFO MemoryStore - Block broadcast_66 stored as values in memory (estimated size 297.9 KiB, free 1916.0 MiB)
19:48:24.476 INFO MemoryStore - Block broadcast_66_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
19:48:24.476 INFO BlockManagerInfo - Added broadcast_66_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.2 MiB)
19:48:24.476 INFO SparkContext - Created broadcast 66 from newAPIHadoopFile at PathSplitSource.java:96
19:48:24.501 INFO MemoryStore - Block broadcast_67 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
19:48:24.507 INFO MemoryStore - Block broadcast_67_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.6 MiB)
19:48:24.508 INFO BlockManagerInfo - Added broadcast_67_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.1 MiB)
19:48:24.508 INFO SparkContext - Created broadcast 67 from newAPIHadoopFile at PathSplitSource.java:96
19:48:24.530 INFO FileInputFormat - Total input files to process : 1
19:48:24.533 INFO MemoryStore - Block broadcast_68 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
19:48:24.534 INFO MemoryStore - Block broadcast_68_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
19:48:24.535 INFO BlockManagerInfo - Added broadcast_68_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:24.535 INFO SparkContext - Created broadcast 68 from broadcast at ReadsSparkSink.java:133
19:48:24.536 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:24.537 INFO MemoryStore - Block broadcast_69 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
19:48:24.538 INFO MemoryStore - Block broadcast_69_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
19:48:24.538 INFO BlockManagerInfo - Added broadcast_69_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:24.539 INFO SparkContext - Created broadcast 69 from broadcast at BamSink.java:76
19:48:24.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts dst=null perm=null proto=rpc
19:48:24.542 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:24.542 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:24.542 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:24.543 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:24.549 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:24.550 INFO DAGScheduler - Registering RDD 153 (mapToPair at SparkUtils.java:161) as input to shuffle 10
19:48:24.550 INFO DAGScheduler - Got job 32 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:24.550 INFO DAGScheduler - Final stage: ResultStage 45 (runJob at SparkHadoopWriter.scala:83)
19:48:24.550 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 44)
19:48:24.550 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 44)
19:48:24.551 INFO DAGScheduler - Submitting ShuffleMapStage 44 (MapPartitionsRDD[153] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:24.568 INFO MemoryStore - Block broadcast_70 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
19:48:24.570 INFO MemoryStore - Block broadcast_70_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
19:48:24.570 INFO BlockManagerInfo - Added broadcast_70_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1918.9 MiB)
19:48:24.570 INFO SparkContext - Created broadcast 70 from broadcast at DAGScheduler.scala:1580
19:48:24.570 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 44 (MapPartitionsRDD[153] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:24.571 INFO TaskSchedulerImpl - Adding task set 44.0 with 1 tasks resource profile 0
19:48:24.572 INFO TaskSetManager - Starting task 0.0 in stage 44.0 (TID 82) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:24.572 INFO Executor - Running task 0.0 in stage 44.0 (TID 82)
19:48:24.591 INFO BlockManagerInfo - Removed broadcast_62_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.0 MiB)
19:48:24.591 INFO BlockManagerInfo - Removed broadcast_61_piece0 on localhost:36125 in memory (size: 67.2 KiB, free: 1919.0 MiB)
19:48:24.592 INFO BlockManagerInfo - Removed broadcast_63_piece0 on localhost:36125 in memory (size: 153.7 KiB, free: 1919.2 MiB)
19:48:24.593 INFO BlockManagerInfo - Removed broadcast_64_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.3 MiB)
19:48:24.595 INFO BlockManagerInfo - Removed broadcast_59_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:24.595 INFO BlockManagerInfo - Removed broadcast_67_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:24.596 INFO BlockManagerInfo - Removed broadcast_65_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:24.598 INFO BlockManagerInfo - Removed broadcast_58_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:24.598 INFO BlockManagerInfo - Removed broadcast_60_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.7 MiB)
19:48:24.599 INFO BlockManagerInfo - Removed broadcast_56_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:24.618 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:24.638 INFO Executor - Finished task 0.0 in stage 44.0 (TID 82). 1191 bytes result sent to driver
19:48:24.639 INFO TaskSetManager - Finished task 0.0 in stage 44.0 (TID 82) in 68 ms on localhost (executor driver) (1/1)
19:48:24.639 INFO TaskSchedulerImpl - Removed TaskSet 44.0, whose tasks have all completed, from pool
19:48:24.639 INFO DAGScheduler - ShuffleMapStage 44 (mapToPair at SparkUtils.java:161) finished in 0.088 s
19:48:24.639 INFO DAGScheduler - looking for newly runnable stages
19:48:24.639 INFO DAGScheduler - running: HashSet()
19:48:24.639 INFO DAGScheduler - waiting: HashSet(ResultStage 45)
19:48:24.639 INFO DAGScheduler - failed: HashSet()
19:48:24.640 INFO DAGScheduler - Submitting ResultStage 45 (MapPartitionsRDD[158] at mapToPair at BamSink.java:91), which has no missing parents
19:48:24.647 INFO MemoryStore - Block broadcast_71 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
19:48:24.648 INFO MemoryStore - Block broadcast_71_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
19:48:24.648 INFO BlockManagerInfo - Added broadcast_71_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.7 MiB)
19:48:24.648 INFO SparkContext - Created broadcast 71 from broadcast at DAGScheduler.scala:1580
19:48:24.648 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 45 (MapPartitionsRDD[158] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:24.648 INFO TaskSchedulerImpl - Adding task set 45.0 with 1 tasks resource profile 0
19:48:24.649 INFO TaskSetManager - Starting task 0.0 in stage 45.0 (TID 83) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:24.650 INFO Executor - Running task 0.0 in stage 45.0 (TID 83)
19:48:24.655 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:24.655 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:24.670 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:24.670 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:24.670 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:24.671 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:24.671 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:24.671 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:24.672 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:24.673 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:24.677 INFO StateChange - BLOCK* allocate blk_1073741844_1020, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/part-r-00000
19:48:24.679 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741844_1020 src: /127.0.0.1:46066 dest: /127.0.0.1:45925
19:48:24.683 INFO clienttrace - src: /127.0.0.1:46066, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741844_1020, duration(ns): 2986125
19:48:24.683 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741844_1020, type=LAST_IN_PIPELINE terminating
19:48:24.683 INFO FSNamesystem - BLOCK* blk_1073741844_1020 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/part-r-00000
19:48:25.085 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:25.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:25.087 INFO StateChange - BLOCK* allocate blk_1073741845_1021, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/.part-r-00000.sbi
19:48:25.088 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741845_1021 src: /127.0.0.1:46070 dest: /127.0.0.1:45925
19:48:25.089 INFO clienttrace - src: /127.0.0.1:46070, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741845_1021, duration(ns): 451576
19:48:25.089 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741845_1021, type=LAST_IN_PIPELINE terminating
19:48:25.089 INFO FSNamesystem - BLOCK* blk_1073741845_1021 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/.part-r-00000.sbi
19:48:25.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741840_1016 replica FinalizedReplica, blk_1073741840_1016, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage10689261495343833868/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741840 for deletion
19:48:25.414 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741840_1016 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741840
19:48:25.490 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:25.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0 dst=null perm=null proto=rpc
19:48:25.492 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0 dst=null perm=null proto=rpc
19:48:25.492 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/task_20250715194824261512958861139339_0158_r_000000 dst=null perm=null proto=rpc
19:48:25.493 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/_temporary/attempt_20250715194824261512958861139339_0158_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/task_20250715194824261512958861139339_0158_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:25.493 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194824261512958861139339_0158_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/task_20250715194824261512958861139339_0158_r_000000
19:48:25.494 INFO SparkHadoopMapRedUtil - attempt_20250715194824261512958861139339_0158_r_000000_0: Committed. Elapsed time: 2 ms.
19:48:25.495 INFO Executor - Finished task 0.0 in stage 45.0 (TID 83). 1858 bytes result sent to driver
19:48:25.496 INFO TaskSetManager - Finished task 0.0 in stage 45.0 (TID 83) in 847 ms on localhost (executor driver) (1/1)
19:48:25.496 INFO TaskSchedulerImpl - Removed TaskSet 45.0, whose tasks have all completed, from pool
19:48:25.496 INFO DAGScheduler - ResultStage 45 (runJob at SparkHadoopWriter.scala:83) finished in 0.856 s
19:48:25.496 INFO DAGScheduler - Job 32 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:25.496 INFO TaskSchedulerImpl - Killing all running tasks in stage 45: Stage finished
19:48:25.496 INFO DAGScheduler - Job 32 finished: runJob at SparkHadoopWriter.scala:83, took 0.947076 s
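
Job 32 is the write itself: ShuffleMapStage 44 keys and sorts the reads (mapToPair at SparkUtils.java:161) and ResultStage 45 streams each sorted partition through a Hadoop OutputFormat (mapToPair at BamSink.java:91, driven by SparkHadoopWriter). The sketch below shows only that shuffle-then-save shape; TextOutputFormat and the toy records are stand-ins, since the real sink plugs in a BAM-specific output format plus a broadcast SAM header that this log does not spell out.

    // Structural sketch of "sort by key, then saveAsNewAPIHadoopFile", which is what surfaces
    // in the log as runJob at SparkHadoopWriter.scala:83. Not the BAM sink's actual code.
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;
    import java.util.Arrays;

    public class SortAndSave {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext("local[1]", "sort-and-save");

            // Toy records keyed by a coordinate-like string; the shuffle orders them.
            JavaPairRDD<String, String> keyed = sc
                    .parallelize(Arrays.asList("chr1:100 readA", "chr1:050 readB"))
                    .mapToPair(s -> new Tuple2<>(s.split(" ")[0], s));

            keyed.sortByKey()                                    // the ShuffleMapStage
                 .mapToPair(t -> new Tuple2<>(new Text(t._1()), new Text(t._2())))
                 .saveAsNewAPIHadoopFile("/tmp/sorted-out",      // output dir must not exist yet
                         Text.class, Text.class, TextOutputFormat.class,
                         sc.hadoopConfiguration());              // the ResultStage (SparkHadoopWriter)
            sc.stop();
        }
    }
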
19:48:25.498 INFO SparkHadoopWriter - Start to commit write Job job_20250715194824261512958861139339_0158.
19:48:25.498 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:25.499 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts dst=null perm=null proto=rpc
19:48:25.500 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/task_20250715194824261512958861139339_0158_r_000000 dst=null perm=null proto=rpc
19:48:25.501 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:25.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/task_20250715194824261512958861139339_0158_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:25.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary/0/task_20250715194824261512958861139339_0158_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:25.504 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.505 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:25.506 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/.spark-staging-158 dst=null perm=null proto=rpc
19:48:25.506 INFO SparkHadoopWriter - Write Job job_20250715194824261512958861139339_0158 committed. Elapsed time: 8 ms.
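
The audit lines above trace the FileOutputCommitter v1 protocol announced earlier by "File Output Committer Algorithm version is 1": the task writes under <output>.bam.parts/_temporary/0/_temporary/attempt_*, commitTask renames that directory to _temporary/0/task_*, and commitJob promotes the files into the .parts directory, deletes _temporary, and writes _SUCCESS. A minimal sketch of that sequence, with invented paths and IDs, driving Hadoop's committer directly:

    // Walkthrough of FileOutputCommitter v1: setup -> write in attempt dir -> commitTask -> commitJob.
    // Paths and job/task IDs are made up for illustration; Spark normally drives this internally.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.JobContext;
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
    import org.apache.hadoop.mapreduce.TaskAttemptID;
    import org.apache.hadoop.mapreduce.TaskType;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter;
    import org.apache.hadoop.mapreduce.task.JobContextImpl;
    import org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl;

    public class CommitterWalkthrough {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);   // matches the log
            Path out = new Path("/tmp/example.bam.parts");        // stand-in for the .bam.parts dir

            TaskAttemptID attempt = new TaskAttemptID("jt", 0, TaskType.REDUCE, 0, 0);
            TaskAttemptContext taskCtx = new TaskAttemptContextImpl(conf, attempt);
            JobContext jobCtx = new JobContextImpl(conf, attempt.getJobID());

            FileOutputCommitter committer = new FileOutputCommitter(out, taskCtx);
            committer.setupJob(jobCtx);                           // creates _temporary/0
            committer.setupTask(taskCtx);

            // The task writes into its attempt directory, i.e. the
            // _temporary/0/_temporary/attempt_* paths seen in the audit lines.
            Path work = committer.getWorkPath();
            try (FSDataOutputStream os = work.getFileSystem(conf).create(new Path(work, "part-r-00000"))) {
                os.writeBytes("placeholder");
            }

            committer.commitTask(taskCtx);                        // rename attempt_* -> task_*
            committer.commitJob(jobCtx);                          // promote files, drop _temporary, write _SUCCESS
        }
    }
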
19:48:25.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.509 INFO StateChange - BLOCK* allocate blk_1073741846_1022, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/header
19:48:25.510 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741846_1022 src: /127.0.0.1:46078 dest: /127.0.0.1:45925
19:48:25.511 INFO clienttrace - src: /127.0.0.1:46078, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741846_1022, duration(ns): 480215
19:48:25.511 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741846_1022, type=LAST_IN_PIPELINE terminating
19:48:25.512 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:25.513 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.514 INFO StateChange - BLOCK* allocate blk_1073741847_1023, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/terminator
19:48:25.515 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741847_1023 src: /127.0.0.1:46088 dest: /127.0.0.1:45925
19:48:25.516 INFO clienttrace - src: /127.0.0.1:46088, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741847_1023, duration(ns): 379241
19:48:25.516 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741847_1023, type=LAST_IN_PIPELINE terminating
19:48:25.517 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:25.518 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts dst=null perm=null proto=rpc
19:48:25.519 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.520 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:25.520 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam
19:48:25.521 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.521 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:25.522 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.522 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam done
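
With the job committed, the sink assembles the final BAM from the three pieces in the .parts directory (a serialized header, the sorted part-r-00000, and the 28-byte BGZF terminator) by concatenating them into a scratch file and renaming it onto the target path, as the cmd=create, cmd=concat, and cmd=rename audit lines above record. A hedged sketch of that sequence using Hadoop's FileSystem API with a made-up file name; note that concat is HDFS-specific and its preconditions vary across Hadoop versions:

    // Mirrors the create -> concat -> rename sequence in the audit log. Illustrative only.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import java.net.URI;

    public class ConcatParts {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:41235/"), conf);

            Path parts    = new Path("/user/runner/example.bam.parts");  // hypothetical name
            Path scratch  = new Path(parts, "output");
            Path finalBam = new Path("/user/runner/example.bam");

            fs.create(scratch).close();                                  // cmd=create .../output
            fs.concat(scratch, new Path[] {                              // cmd=concat header, part, terminator
                    new Path(parts, "header"),
                    new Path(parts, "part-r-00000"),
                    new Path(parts, "terminator")
            });
            fs.rename(scratch, finalBam);                                // cmd=rename -> final .bam
        }
    }
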
19:48:25.522 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:25.523 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi
19:48:25.523 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts dst=null perm=null proto=rpc
19:48:25.524 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:25.525 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:25.526 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:25.527 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:25.528 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:25.528 INFO StateChange - BLOCK* allocate blk_1073741848_1024, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi
19:48:25.529 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741848_1024 src: /127.0.0.1:46096 dest: /127.0.0.1:45925
19:48:25.530 INFO clienttrace - src: /127.0.0.1:46096, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741848_1024, duration(ns): 383699
19:48:25.531 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741848_1024, type=LAST_IN_PIPELINE terminating
19:48:25.531 INFO FSNamesystem - BLOCK* blk_1073741848_1024 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi
19:48:25.932 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:25.933 INFO IndexFileMerger - Done merging .sbi files
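
The splitting-BAM index follows the same pattern: the hidden .part-r-00000.sbi written by the task is merged into <bam>.sbi and the whole .parts directory is then deleted. The sketch below is a deliberately naive stand-in that only shows the file plumbing (list, copy, cleanup) with hypothetical paths; the actual IndexFileMerger also rewrites the index entries' virtual offsets to account for the concatenated header and any preceding parts, which a plain copy does not do.

    // File plumbing only: locate the per-part .sbi, copy it next to the final BAM, drop .parts.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FileUtil;
    import org.apache.hadoop.fs.Path;
    import java.net.URI;

    public class PromoteSbi {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:41235/"), conf);

            Path parts    = new Path("/user/runner/example.bam.parts");  // hypothetical name
            Path finalSbi = new Path("/user/runner/example.bam.sbi");

            for (FileStatus st : fs.listStatus(parts)) {
                if (st.getPath().getName().endsWith(".sbi")) {
                    // Copy (not rename) so the .parts directory can still be deleted wholesale below.
                    // A real merge would rewrite the SBI offsets here instead of copying bytes.
                    FileUtil.copy(fs, st.getPath(), fs, finalSbi, false, conf);
                }
            }
            fs.delete(parts, true);                                      // cmd=delete .bam.parts
        }
    }
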
19:48:25.933 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.parts dst=null perm=null proto=rpc
19:48:25.943 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi dst=null perm=null proto=rpc
19:48:25.943 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi dst=null perm=null proto=rpc
19:48:25.944 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi dst=null perm=null proto=rpc
19:48:25.945 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:25.946 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:25.946 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:25.947 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:25.947 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:25.948 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.bai dst=null perm=null proto=rpc
19:48:25.949 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bai dst=null perm=null proto=rpc
19:48:25.950 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:25.952 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:25.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi dst=null perm=null proto=rpc
19:48:25.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi dst=null perm=null proto=rpc
19:48:25.953 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.sbi dst=null perm=null proto=rpc
19:48:25.954 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:25.954 INFO MemoryStore - Block broadcast_72 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
19:48:25.955 INFO MemoryStore - Block broadcast_72_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
19:48:25.955 INFO BlockManagerInfo - Added broadcast_72_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.7 MiB)
19:48:25.956 INFO SparkContext - Created broadcast 72 from broadcast at BamSource.java:104
19:48:25.958 INFO MemoryStore - Block broadcast_73 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:25.968 INFO MemoryStore - Block broadcast_73_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:25.968 INFO BlockManagerInfo - Added broadcast_73_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:25.968 INFO SparkContext - Created broadcast 73 from newAPIHadoopFile at PathSplitSource.java:96
19:48:25.979 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:25.979 INFO FileInputFormat - Total input files to process : 1
19:48:25.979 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:25.995 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:25.995 INFO DAGScheduler - Got job 33 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:25.995 INFO DAGScheduler - Final stage: ResultStage 46 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:25.995 INFO DAGScheduler - Parents of final stage: List()
19:48:25.995 INFO DAGScheduler - Missing parents: List()
19:48:25.996 INFO DAGScheduler - Submitting ResultStage 46 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:26.006 INFO MemoryStore - Block broadcast_74 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
19:48:26.007 INFO MemoryStore - Block broadcast_74_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
19:48:26.007 INFO BlockManagerInfo - Added broadcast_74_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:26.007 INFO SparkContext - Created broadcast 74 from broadcast at DAGScheduler.scala:1580
19:48:26.008 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 46 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:26.008 INFO TaskSchedulerImpl - Adding task set 46.0 with 1 tasks resource profile 0
19:48:26.008 INFO TaskSetManager - Starting task 0.0 in stage 46.0 (TID 84) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:26.009 INFO Executor - Running task 0.0 in stage 46.0 (TID 84)
19:48:26.021 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam:0+237038
19:48:26.022 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:26.023 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:26.024 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.bai dst=null perm=null proto=rpc
19:48:26.025 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bai dst=null perm=null proto=rpc
19:48:26.026 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:26.029 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:26.032 INFO Executor - Finished task 0.0 in stage 46.0 (TID 84). 651526 bytes result sent to driver
19:48:26.035 INFO TaskSetManager - Finished task 0.0 in stage 46.0 (TID 84) in 27 ms on localhost (executor driver) (1/1)
19:48:26.035 INFO TaskSchedulerImpl - Removed TaskSet 46.0, whose tasks have all completed, from pool
19:48:26.035 INFO DAGScheduler - ResultStage 46 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.039 s
19:48:26.035 INFO DAGScheduler - Job 33 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:26.035 INFO TaskSchedulerImpl - Killing all running tasks in stage 46: Stage finished
19:48:26.035 INFO DAGScheduler - Job 33 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.040552 s
19:48:26.048 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:26.049 INFO DAGScheduler - Got job 34 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:26.049 INFO DAGScheduler - Final stage: ResultStage 47 (count at ReadsSparkSinkUnitTest.java:185)
19:48:26.049 INFO DAGScheduler - Parents of final stage: List()
19:48:26.049 INFO DAGScheduler - Missing parents: List()
19:48:26.049 INFO DAGScheduler - Submitting ResultStage 47 (MapPartitionsRDD[146] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:26.076 INFO MemoryStore - Block broadcast_75 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:48:26.077 INFO MemoryStore - Block broadcast_75_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
19:48:26.078 INFO BlockManagerInfo - Added broadcast_75_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:26.078 INFO SparkContext - Created broadcast 75 from broadcast at DAGScheduler.scala:1580
19:48:26.078 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 47 (MapPartitionsRDD[146] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:26.078 INFO TaskSchedulerImpl - Adding task set 47.0 with 1 tasks resource profile 0
19:48:26.079 INFO TaskSetManager - Starting task 0.0 in stage 47.0 (TID 85) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:26.079 INFO Executor - Running task 0.0 in stage 47.0 (TID 85)
19:48:26.117 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:26.130 INFO Executor - Finished task 0.0 in stage 47.0 (TID 85). 989 bytes result sent to driver
19:48:26.131 INFO TaskSetManager - Finished task 0.0 in stage 47.0 (TID 85) in 52 ms on localhost (executor driver) (1/1)
19:48:26.131 INFO TaskSchedulerImpl - Removed TaskSet 47.0, whose tasks have all completed, from pool
19:48:26.131 INFO DAGScheduler - ResultStage 47 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.081 s
19:48:26.132 INFO DAGScheduler - Job 34 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:26.132 INFO TaskSchedulerImpl - Killing all running tasks in stage 47: Stage finished
19:48:26.132 INFO DAGScheduler - Job 34 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.083754 s
19:48:26.136 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:26.137 INFO DAGScheduler - Got job 35 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:26.137 INFO DAGScheduler - Final stage: ResultStage 48 (count at ReadsSparkSinkUnitTest.java:185)
19:48:26.137 INFO DAGScheduler - Parents of final stage: List()
19:48:26.137 INFO DAGScheduler - Missing parents: List()
19:48:26.137 INFO DAGScheduler - Submitting ResultStage 48 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:26.147 INFO MemoryStore - Block broadcast_76 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
19:48:26.148 INFO MemoryStore - Block broadcast_76_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
19:48:26.148 INFO BlockManagerInfo - Added broadcast_76_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.4 MiB)
19:48:26.148 INFO SparkContext - Created broadcast 76 from broadcast at DAGScheduler.scala:1580
19:48:26.149 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 48 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:26.149 INFO TaskSchedulerImpl - Adding task set 48.0 with 1 tasks resource profile 0
19:48:26.149 INFO TaskSetManager - Starting task 0.0 in stage 48.0 (TID 86) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:26.150 INFO Executor - Running task 0.0 in stage 48.0 (TID 86)
19:48:26.163 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam:0+237038
19:48:26.163 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:26.164 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam dst=null perm=null proto=rpc
19:48:26.165 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bam.bai dst=null perm=null proto=rpc
19:48:26.166 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d2ba38a6-05c1-4996-8b79-8afeb9245143.bai dst=null perm=null proto=rpc
19:48:26.167 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:26.170 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:26.171 INFO Executor - Finished task 0.0 in stage 48.0 (TID 86). 989 bytes result sent to driver
19:48:26.172 INFO TaskSetManager - Finished task 0.0 in stage 48.0 (TID 86) in 23 ms on localhost (executor driver) (1/1)
19:48:26.172 INFO TaskSchedulerImpl - Removed TaskSet 48.0, whose tasks have all completed, from pool
19:48:26.172 INFO DAGScheduler - ResultStage 48 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.035 s
19:48:26.172 INFO DAGScheduler - Job 35 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:26.172 INFO TaskSchedulerImpl - Killing all running tasks in stage 48: Stage finished
19:48:26.172 INFO DAGScheduler - Job 35 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.036006 s
19:48:26.177 INFO MemoryStore - Block broadcast_77 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
19:48:26.187 INFO MemoryStore - Block broadcast_77_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
19:48:26.187 INFO BlockManagerInfo - Added broadcast_77_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:26.187 INFO SparkContext - Created broadcast 77 from newAPIHadoopFile at PathSplitSource.java:96
19:48:26.221 INFO MemoryStore - Block broadcast_78 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
19:48:26.227 INFO MemoryStore - Block broadcast_78_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
19:48:26.228 INFO BlockManagerInfo - Added broadcast_78_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:26.228 INFO SparkContext - Created broadcast 78 from newAPIHadoopFile at PathSplitSource.java:96
19:48:26.249 INFO FileInputFormat - Total input files to process : 1
19:48:26.251 INFO MemoryStore - Block broadcast_79 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
19:48:26.252 INFO MemoryStore - Block broadcast_79_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
19:48:26.252 INFO BlockManagerInfo - Added broadcast_79_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:26.252 INFO SparkContext - Created broadcast 79 from broadcast at ReadsSparkSink.java:133
19:48:26.253 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:26.253 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:26.254 INFO MemoryStore - Block broadcast_80 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
19:48:26.255 INFO MemoryStore - Block broadcast_80_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
19:48:26.256 INFO BlockManagerInfo - Added broadcast_80_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:26.256 INFO SparkContext - Created broadcast 80 from broadcast at BamSink.java:76
19:48:26.258 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts dst=null perm=null proto=rpc
19:48:26.259 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:26.259 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:26.259 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:26.260 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:26.266 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:26.266 INFO DAGScheduler - Registering RDD 178 (mapToPair at SparkUtils.java:161) as input to shuffle 11
19:48:26.267 INFO DAGScheduler - Got job 36 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:26.267 INFO DAGScheduler - Final stage: ResultStage 50 (runJob at SparkHadoopWriter.scala:83)
19:48:26.267 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 49)
19:48:26.267 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 49)
19:48:26.267 INFO DAGScheduler - Submitting ShuffleMapStage 49 (MapPartitionsRDD[178] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:26.285 INFO MemoryStore - Block broadcast_81 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
19:48:26.286 INFO MemoryStore - Block broadcast_81_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
19:48:26.287 INFO BlockManagerInfo - Added broadcast_81_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.1 MiB)
19:48:26.287 INFO SparkContext - Created broadcast 81 from broadcast at DAGScheduler.scala:1580
19:48:26.287 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 49 (MapPartitionsRDD[178] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:26.287 INFO TaskSchedulerImpl - Adding task set 49.0 with 1 tasks resource profile 0
19:48:26.288 INFO TaskSetManager - Starting task 0.0 in stage 49.0 (TID 87) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:26.288 INFO Executor - Running task 0.0 in stage 49.0 (TID 87)
19:48:26.321 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:26.339 INFO Executor - Finished task 0.0 in stage 49.0 (TID 87). 1148 bytes result sent to driver
19:48:26.339 INFO TaskSetManager - Finished task 0.0 in stage 49.0 (TID 87) in 51 ms on localhost (executor driver) (1/1)
19:48:26.339 INFO TaskSchedulerImpl - Removed TaskSet 49.0, whose tasks have all completed, from pool
19:48:26.340 INFO DAGScheduler - ShuffleMapStage 49 (mapToPair at SparkUtils.java:161) finished in 0.073 s
19:48:26.340 INFO DAGScheduler - looking for newly runnable stages
19:48:26.340 INFO DAGScheduler - running: HashSet()
19:48:26.340 INFO DAGScheduler - waiting: HashSet(ResultStage 50)
19:48:26.340 INFO DAGScheduler - failed: HashSet()
19:48:26.340 INFO DAGScheduler - Submitting ResultStage 50 (MapPartitionsRDD[183] at mapToPair at BamSink.java:91), which has no missing parents
19:48:26.351 INFO MemoryStore - Block broadcast_82 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
19:48:26.352 INFO MemoryStore - Block broadcast_82_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.1 MiB)
19:48:26.353 INFO BlockManagerInfo - Added broadcast_82_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.1 MiB)
19:48:26.353 INFO SparkContext - Created broadcast 82 from broadcast at DAGScheduler.scala:1580
19:48:26.353 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 50 (MapPartitionsRDD[183] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:26.353 INFO TaskSchedulerImpl - Adding task set 50.0 with 1 tasks resource profile 0
19:48:26.354 INFO TaskSetManager - Starting task 0.0 in stage 50.0 (TID 88) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:26.354 INFO Executor - Running task 0.0 in stage 50.0 (TID 88)
19:48:26.362 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:26.363 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:26.384 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:26.384 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:26.384 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:26.384 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:26.384 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:26.384 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:26.385 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/_temporary/attempt_20250715194826715523312090830665_0183_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:26.388 INFO StateChange - BLOCK* allocate blk_1073741849_1025, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/_temporary/attempt_20250715194826715523312090830665_0183_r_000000_0/part-r-00000
19:48:26.390 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741849_1025 src: /127.0.0.1:46104 dest: /127.0.0.1:45925
19:48:26.393 INFO clienttrace - src: /127.0.0.1:46104, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741849_1025, duration(ns): 2331579
19:48:26.393 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741849_1025, type=LAST_IN_PIPELINE terminating
19:48:26.394 INFO FSNamesystem - BLOCK* blk_1073741849_1025 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/_temporary/attempt_20250715194826715523312090830665_0183_r_000000_0/part-r-00000
19:48:26.795 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/_temporary/attempt_20250715194826715523312090830665_0183_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
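The allocate / Receiving / PacketResponder / completeFile sequence above is a single HDFS file write. The "COMMITTED but not COMPLETE" line shows up because this mini-cluster has one DataNode and the NameNode will not complete the file until the block reaches minimum replication, so the client's close() retries until the replica is reported, which accounts for the roughly 400 ms gap before completeFile. A minimal client-side sketch, assuming the test's NameNode address:

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical endpoint; the log's cluster listens on hdfs://localhost:41235.
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:41235"), conf);
        Path p = new Path("/user/runner/example/part-r-00000");
        try (FSDataOutputStream out = fs.create(p, true)) {
            out.write(new byte[] {0x1f, (byte) 0x8b});   // some payload bytes
        }
        // close() returns only after the NameNode marks the file COMPLETE,
        // i.e. after the last block is finalized on enough DataNodes.
        System.out.println("length = " + fs.getFileStatus(p).getLen());
    }
}
```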
19:48:26.795 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/_temporary/attempt_20250715194826715523312090830665_0183_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:26.796 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/_temporary/attempt_20250715194826715523312090830665_0183_r_000000_0 dst=null perm=null proto=rpc
19:48:26.797 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/_temporary/attempt_20250715194826715523312090830665_0183_r_000000_0 dst=null perm=null proto=rpc
19:48:26.798 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/task_20250715194826715523312090830665_0183_r_000000 dst=null perm=null proto=rpc
19:48:26.799 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/_temporary/attempt_20250715194826715523312090830665_0183_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/task_20250715194826715523312090830665_0183_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:26.799 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194826715523312090830665_0183_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/task_20250715194826715523312090830665_0183_r_000000
19:48:26.799 INFO SparkHadoopMapRedUtil - attempt_20250715194826715523312090830665_0183_r_000000_0: Committed. Elapsed time: 2 ms.
19:48:26.800 INFO Executor - Finished task 0.0 in stage 50.0 (TID 88). 1858 bytes result sent to driver
19:48:26.802 INFO TaskSetManager - Finished task 0.0 in stage 50.0 (TID 88) in 448 ms on localhost (executor driver) (1/1)
19:48:26.802 INFO TaskSchedulerImpl - Removed TaskSet 50.0, whose tasks have all completed, from pool
19:48:26.802 INFO DAGScheduler - ResultStage 50 (runJob at SparkHadoopWriter.scala:83) finished in 0.462 s
19:48:26.802 INFO DAGScheduler - Job 36 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:26.802 INFO TaskSchedulerImpl - Killing all running tasks in stage 50: Stage finished
19:48:26.802 INFO DAGScheduler - Job 36 finished: runJob at SparkHadoopWriter.scala:83, took 0.536440 s
19:48:26.803 INFO SparkHadoopWriter - Start to commit write Job job_20250715194826715523312090830665_0183.
19:48:26.804 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:26.804 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts dst=null perm=null proto=rpc
19:48:26.805 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/task_20250715194826715523312090830665_0183_r_000000 dst=null perm=null proto=rpc
19:48:26.806 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:26.806 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary/0/task_20250715194826715523312090830665_0183_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:26.807 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:26.808 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:26.809 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:26.810 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/.spark-staging-183 dst=null perm=null proto=rpc
19:48:26.810 INFO SparkHadoopWriter - Write Job job_20250715194826715523312090830665_0183 committed. Elapsed time: 6 ms.
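Job commit above is purely filesystem metadata work: one rename per part file, a delete of _temporary, and an empty _SUCCESS marker, which is why it finishes in 6 ms. A small sketch of how downstream code can check for a committed output; it assumes the default filesystem in the Configuration points at the test cluster, and the directory name is illustrative:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CommitCheckSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Hypothetical output directory standing in for the *.bam.parts dir in the log.
        Path out = new Path("/user/runner/example.bam.parts");
        boolean committed = fs.exists(new Path(out, "_SUCCESS"));
        System.out.println("job committed: " + committed);
        for (FileStatus s : fs.listStatus(out)) {
            // After commit the _temporary tree is gone; only part files, the
            // _SUCCESS marker and any side files (header, indexes) remain.
            System.out.println(s.getPath().getName() + "\t" + s.getLen());
        }
    }
}
```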
19:48:26.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:26.812 INFO StateChange - BLOCK* allocate blk_1073741850_1026, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/header
19:48:26.813 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741850_1026 src: /127.0.0.1:46116 dest: /127.0.0.1:45925
19:48:26.814 INFO clienttrace - src: /127.0.0.1:46116, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741850_1026, duration(ns): 494995
19:48:26.814 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741850_1026, type=LAST_IN_PIPELINE terminating
19:48:26.815 INFO FSNamesystem - BLOCK* blk_1073741850_1026 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/header
19:48:27.216 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:27.217 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:27.218 INFO StateChange - BLOCK* allocate blk_1073741851_1027, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/terminator
19:48:27.219 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741851_1027 src: /127.0.0.1:49894 dest: /127.0.0.1:45925
19:48:27.221 INFO clienttrace - src: /127.0.0.1:49894, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741851_1027, duration(ns): 521688
19:48:27.221 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741851_1027, type=LAST_IN_PIPELINE terminating
19:48:27.221 INFO FSNamesystem - BLOCK* blk_1073741851_1027 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/terminator
19:48:27.622 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:27.623 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts dst=null perm=null proto=rpc
19:48:27.624 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:27.625 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:27.625 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam
19:48:27.626 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:27.626 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.627 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:27.627 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam done
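The final BAM is assembled without rewriting any bytes: an empty "output" file is created, FileSystem.concat splices the header, part-r-00000 and terminator blocks onto it, and a rename moves the result to the .bam path before the .parts directory is deleted. A sketch of that sequence with illustrative paths; concat is HDFS-specific, so a filesystem without it would need a copy-based merge instead:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ConcatSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());   // assumed to be HDFS
        Path parts    = new Path("/user/runner/example.bam.parts");
        Path target   = new Path(parts, "output");
        Path finalBam = new Path("/user/runner/example.bam");

        fs.create(target, true).close();            // concat needs an existing target file
        fs.concat(target, new Path[] {              // splice blocks, no data copy
                new Path(parts, "header"),
                new Path(parts, "part-r-00000"),
                new Path(parts, "terminator") });
        fs.rename(target, finalBam);                // move the merged file into place
        fs.delete(parts, true);                     // drop the .parts staging directory
    }
}
```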
19:48:27.628 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.628 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.parts dst=null perm=null proto=rpc
19:48:27.629 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.631 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.632 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.bai dst=null perm=null proto=rpc
19:48:27.632 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bai dst=null perm=null proto=rpc
19:48:27.634 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:27.635 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.636 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.sbi dst=null perm=null proto=rpc
19:48:27.638 INFO MemoryStore - Block broadcast_83 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
19:48:27.649 INFO MemoryStore - Block broadcast_83_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1914.7 MiB)
19:48:27.649 INFO BlockManagerInfo - Added broadcast_83_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.0 MiB)
19:48:27.649 INFO SparkContext - Created broadcast 83 from newAPIHadoopFile at PathSplitSource.java:96
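Broadcast 83 comes from newAPIHadoopFile, i.e. the written BAM is read back through a Hadoop InputFormat whose splits drive the Spark partitions (the "Total input files to process : 1" and "Input split:" lines that follow). A generic sketch of that call; TextInputFormat and the path are stand-ins for the BAM-aware format the reads source actually plugs in, but the split/record mechanics are the same:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class NewApiHadoopFileSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("newapi-read").setMaster("local[1]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // One partition per input split; each task runs the format's RecordReader.
            JavaPairRDD<LongWritable, Text> records = sc.newAPIHadoopFile(
                    "hdfs://localhost:41235/user/runner/example.txt",   // illustrative path
                    TextInputFormat.class, LongWritable.class, Text.class,
                    new Configuration());
            System.out.println("records: " + records.count());
        }
    }
}
```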
19:48:27.676 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.677 INFO FileInputFormat - Total input files to process : 1
19:48:27.677 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.713 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:27.714 INFO DAGScheduler - Got job 37 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:27.714 INFO DAGScheduler - Final stage: ResultStage 51 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:27.714 INFO DAGScheduler - Parents of final stage: List()
19:48:27.714 INFO DAGScheduler - Missing parents: List()
19:48:27.714 INFO DAGScheduler - Submitting ResultStage 51 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:27.747 INFO MemoryStore - Block broadcast_84 stored as values in memory (estimated size 426.2 KiB, free 1914.4 MiB)
19:48:27.747 INFO BlockManagerInfo - Removed broadcast_73_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.1 MiB)
19:48:27.748 INFO BlockManagerInfo - Removed broadcast_78_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.1 MiB)
19:48:27.749 INFO MemoryStore - Block broadcast_84_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1914.9 MiB)
19:48:27.749 INFO BlockManagerInfo - Removed broadcast_74_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.2 MiB)
19:48:27.749 INFO BlockManagerInfo - Added broadcast_84_piece0 in memory on localhost:36125 (size: 153.7 KiB, free: 1919.0 MiB)
19:48:27.749 INFO SparkContext - Created broadcast 84 from broadcast at DAGScheduler.scala:1580
19:48:27.749 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 51 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:27.750 INFO TaskSchedulerImpl - Adding task set 51.0 with 1 tasks resource profile 0
19:48:27.750 INFO BlockManagerInfo - Removed broadcast_70_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.2 MiB)
19:48:27.750 INFO TaskSetManager - Starting task 0.0 in stage 51.0 (TID 89) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:27.751 INFO Executor - Running task 0.0 in stage 51.0 (TID 89)
19:48:27.751 INFO BlockManagerInfo - Removed broadcast_75_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.3 MiB)
19:48:27.752 INFO BlockManagerInfo - Removed broadcast_72_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.3 MiB)
19:48:27.753 INFO BlockManagerInfo - Removed broadcast_79_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:48:27.755 INFO BlockManagerInfo - Removed broadcast_71_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.4 MiB)
19:48:27.756 INFO BlockManagerInfo - Removed broadcast_81_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.6 MiB)
19:48:27.756 INFO BlockManagerInfo - Removed broadcast_68_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:27.757 INFO BlockManagerInfo - Removed broadcast_69_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:27.759 INFO BlockManagerInfo - Removed broadcast_82_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.6 MiB)
19:48:27.759 INFO BlockManagerInfo - Removed broadcast_76_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.7 MiB)
19:48:27.760 INFO BlockManagerInfo - Removed broadcast_80_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.7 MiB)
19:48:27.761 INFO BlockManagerInfo - Removed broadcast_66_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:27.786 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam:0+237038
19:48:27.787 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.788 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.789 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:27.790 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.790 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.791 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.bai dst=null perm=null proto=rpc
19:48:27.792 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bai dst=null perm=null proto=rpc
19:48:27.793 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:27.795 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.795 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.802 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.803 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.804 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.805 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.806 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.807 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.808 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.809 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.810 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.811 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.812 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.813 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.814 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.815 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.818 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.818 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.819 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.820 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.821 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.821 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.822 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.823 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.824 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.824 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.825 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.826 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.828 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.829 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.830 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.831 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.832 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.833 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.834 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.836 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.837 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.838 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.839 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.840 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.841 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.841 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.842 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.843 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.844 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.844 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.845 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.846 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.846 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.847 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.848 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.849 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.849 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.850 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.851 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.852 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:27.853 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:27.854 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.855 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:27.856 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.bai dst=null perm=null proto=rpc
19:48:27.857 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bai dst=null perm=null proto=rpc
19:48:27.858 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:27.862 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:27.866 INFO Executor - Finished task 0.0 in stage 51.0 (TID 89). 651526 bytes result sent to driver
19:48:27.868 INFO TaskSetManager - Finished task 0.0 in stage 51.0 (TID 89) in 118 ms on localhost (executor driver) (1/1)
19:48:27.869 INFO TaskSchedulerImpl - Removed TaskSet 51.0, whose tasks have all completed, from pool
19:48:27.869 INFO DAGScheduler - ResultStage 51 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.154 s
19:48:27.869 INFO DAGScheduler - Job 37 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:27.869 INFO TaskSchedulerImpl - Killing all running tasks in stage 51: Stage finished
19:48:27.869 INFO DAGScheduler - Job 37 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.155581 s
19:48:27.881 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:27.881 INFO DAGScheduler - Got job 38 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:27.881 INFO DAGScheduler - Final stage: ResultStage 52 (count at ReadsSparkSinkUnitTest.java:185)
19:48:27.881 INFO DAGScheduler - Parents of final stage: List()
19:48:27.881 INFO DAGScheduler - Missing parents: List()
19:48:27.881 INFO DAGScheduler - Submitting ResultStage 52 (MapPartitionsRDD[171] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:27.902 INFO MemoryStore - Block broadcast_85 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
19:48:27.904 INFO MemoryStore - Block broadcast_85_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
19:48:27.904 INFO BlockManagerInfo - Added broadcast_85_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.6 MiB)
19:48:27.904 INFO SparkContext - Created broadcast 85 from broadcast at DAGScheduler.scala:1580
19:48:27.904 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 52 (MapPartitionsRDD[171] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:27.904 INFO TaskSchedulerImpl - Adding task set 52.0 with 1 tasks resource profile 0
19:48:27.905 INFO TaskSetManager - Starting task 0.0 in stage 52.0 (TID 90) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:27.905 INFO Executor - Running task 0.0 in stage 52.0 (TID 90)
19:48:27.940 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:27.951 INFO Executor - Finished task 0.0 in stage 52.0 (TID 90). 989 bytes result sent to driver
19:48:27.952 INFO TaskSetManager - Finished task 0.0 in stage 52.0 (TID 90) in 47 ms on localhost (executor driver) (1/1)
19:48:27.952 INFO TaskSchedulerImpl - Removed TaskSet 52.0, whose tasks have all completed, from pool
19:48:27.952 INFO DAGScheduler - ResultStage 52 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.070 s
19:48:27.952 INFO DAGScheduler - Job 38 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:27.952 INFO TaskSchedulerImpl - Killing all running tasks in stage 52: Stage finished
19:48:27.952 INFO DAGScheduler - Job 38 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.071703 s
19:48:27.956 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:27.956 INFO DAGScheduler - Got job 39 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:27.956 INFO DAGScheduler - Final stage: ResultStage 53 (count at ReadsSparkSinkUnitTest.java:185)
19:48:27.957 INFO DAGScheduler - Parents of final stage: List()
19:48:27.957 INFO DAGScheduler - Missing parents: List()
19:48:27.957 INFO DAGScheduler - Submitting ResultStage 53 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:27.987 INFO MemoryStore - Block broadcast_86 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
19:48:27.988 INFO MemoryStore - Block broadcast_86_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
19:48:27.989 INFO BlockManagerInfo - Added broadcast_86_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:27.989 INFO SparkContext - Created broadcast 86 from broadcast at DAGScheduler.scala:1580
19:48:27.989 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 53 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:27.989 INFO TaskSchedulerImpl - Adding task set 53.0 with 1 tasks resource profile 0
19:48:27.990 INFO TaskSetManager - Starting task 0.0 in stage 53.0 (TID 91) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:27.990 INFO Executor - Running task 0.0 in stage 53.0 (TID 91)
19:48:28.027 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam:0+237038
19:48:28.028 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:28.029 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:28.030 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:28.031 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:28.031 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:28.032 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.bai dst=null perm=null proto=rpc
19:48:28.033 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bai dst=null perm=null proto=rpc
19:48:28.034 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:28.036 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:28.036 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:28.037 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.038 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:28.044 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.045 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.046 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.046 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.047 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.048 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.048 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.049 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.050 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.051 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.052 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.052 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.053 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.054 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.055 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.056 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.057 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.058 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.059 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.061 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.062 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.063 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.064 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.065 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.066 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.066 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.067 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.068 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.070 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.071 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.072 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.072 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.073 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.074 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.074 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.075 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.076 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.077 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.077 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.078 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.079 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.080 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.081 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.082 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.083 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.083 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.084 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.085 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.086 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.087 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.088 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.089 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.090 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.091 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.091 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.092 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.093 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.094 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:28.095 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.095 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:28.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam dst=null perm=null proto=rpc
19:48:28.097 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bam.bai dst=null perm=null proto=rpc
19:48:28.098 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_6cc4b124-6b5b-4fa8-b54c-84cc283c2d4e.bai dst=null perm=null proto=rpc
19:48:28.099 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:28.102 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:28.102 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:28.105 INFO Executor - Finished task 0.0 in stage 53.0 (TID 91). 989 bytes result sent to driver
19:48:28.105 INFO TaskSetManager - Finished task 0.0 in stage 53.0 (TID 91) in 115 ms on localhost (executor driver) (1/1)
19:48:28.105 INFO TaskSchedulerImpl - Removed TaskSet 53.0, whose tasks have all completed, from pool
19:48:28.105 INFO DAGScheduler - ResultStage 53 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.148 s
19:48:28.105 INFO DAGScheduler - Job 39 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:28.105 INFO TaskSchedulerImpl - Killing all running tasks in stage 53: Stage finished
19:48:28.106 INFO DAGScheduler - Job 39 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.149472 s
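Jobs 37 through 39 are the verification half of the test: collect the round-tripped reads, then count both the original RDD and the written-and-reread RDD (the two "count at ReadsSparkSinkUnitTest.java:185" jobs) and compare. A stripped-down sketch of the same write/read/compare pattern, using text files rather than BAM and assumed local paths:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class RoundTripCheckSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("round-trip-check").setMaster("local[1]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            JavaRDD<String> original = sc.parallelize(java.util.Arrays.asList("r1", "r2", "r3"));
            original.saveAsTextFile("/tmp/round-trip-out");       // stand-in for the BAM write
            JavaRDD<String> roundTripped = sc.textFile("/tmp/round-trip-out");
            // The unit test does the analogous comparison with reads: collect both
            // sides, then assert that counts (and contents) match.
            if (original.count() != roundTripped.count()) {
                throw new AssertionError("round trip lost records");
            }
        }
    }
}
```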
19:48:28.110 INFO MemoryStore - Block broadcast_87 stored as values in memory (estimated size 298.0 KiB, free 1917.3 MiB)
19:48:28.116 INFO MemoryStore - Block broadcast_87_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.3 MiB)
19:48:28.117 INFO BlockManagerInfo - Added broadcast_87_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.4 MiB)
19:48:28.117 INFO SparkContext - Created broadcast 87 from newAPIHadoopFile at PathSplitSource.java:96
19:48:28.144 INFO MemoryStore - Block broadcast_88 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
19:48:28.150 INFO MemoryStore - Block broadcast_88_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.9 MiB)
19:48:28.150 INFO BlockManagerInfo - Added broadcast_88_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.4 MiB)
19:48:28.150 INFO SparkContext - Created broadcast 88 from newAPIHadoopFile at PathSplitSource.java:96
19:48:28.172 INFO FileInputFormat - Total input files to process : 1
19:48:28.174 INFO MemoryStore - Block broadcast_89 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
19:48:28.176 INFO MemoryStore - Block broadcast_89_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
19:48:28.176 INFO BlockManagerInfo - Added broadcast_89_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:28.176 INFO SparkContext - Created broadcast 89 from broadcast at ReadsSparkSink.java:133
19:48:28.177 INFO MemoryStore - Block broadcast_90 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
19:48:28.178 INFO MemoryStore - Block broadcast_90_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
19:48:28.179 INFO BlockManagerInfo - Added broadcast_90_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:28.179 INFO SparkContext - Created broadcast 90 from broadcast at BamSink.java:76
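Unlike the scheduler's own broadcast_8x task binaries, broadcasts 89 and 90 are created explicitly at ReadsSparkSink.java:133 and BamSink.java:76, plausibly the shared header state the writers need on every executor. A minimal broadcast sketch; the payload is a made-up stand-in:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;

public class BroadcastSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("broadcast-sketch").setMaster("local[1]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Hypothetical stand-in for a shared, read-only header that is broadcast
            // once rather than serialized into every task closure.
            String header = "@HD\tVN:1.6\tSO:coordinate";
            Broadcast<String> headerBc = sc.broadcast(header);
            long n = sc.parallelize(java.util.Arrays.asList(1, 2, 3))
                       .map(i -> headerBc.value().length() + i)   // executors read the broadcast
                       .count();
            System.out.println("records touched: " + n);
        }
    }
}
```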
19:48:28.181 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts dst=null perm=null proto=rpc
19:48:28.182 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:28.182 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:28.182 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:28.183 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:28.193 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:28.194 INFO DAGScheduler - Registering RDD 204 (mapToPair at SparkUtils.java:161) as input to shuffle 12
19:48:28.195 INFO DAGScheduler - Got job 40 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:28.195 INFO DAGScheduler - Final stage: ResultStage 55 (runJob at SparkHadoopWriter.scala:83)
19:48:28.195 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 54)
19:48:28.195 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 54)
19:48:28.195 INFO DAGScheduler - Submitting ShuffleMapStage 54 (MapPartitionsRDD[204] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:28.213 INFO MemoryStore - Block broadcast_91 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
19:48:28.214 INFO MemoryStore - Block broadcast_91_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
19:48:28.215 INFO BlockManagerInfo - Added broadcast_91_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:48:28.215 INFO SparkContext - Created broadcast 91 from broadcast at DAGScheduler.scala:1580
19:48:28.215 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 54 (MapPartitionsRDD[204] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:28.215 INFO TaskSchedulerImpl - Adding task set 54.0 with 1 tasks resource profile 0
19:48:28.216 INFO TaskSetManager - Starting task 0.0 in stage 54.0 (TID 92) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
19:48:28.216 INFO Executor - Running task 0.0 in stage 54.0 (TID 92)
19:48:28.248 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:28.270 INFO Executor - Finished task 0.0 in stage 54.0 (TID 92). 1148 bytes result sent to driver
19:48:28.270 INFO TaskSetManager - Finished task 0.0 in stage 54.0 (TID 92) in 54 ms on localhost (executor driver) (1/1)
19:48:28.270 INFO TaskSchedulerImpl - Removed TaskSet 54.0, whose tasks have all completed, from pool
19:48:28.271 INFO DAGScheduler - ShuffleMapStage 54 (mapToPair at SparkUtils.java:161) finished in 0.076 s
19:48:28.271 INFO DAGScheduler - looking for newly runnable stages
19:48:28.271 INFO DAGScheduler - running: HashSet()
19:48:28.271 INFO DAGScheduler - waiting: HashSet(ResultStage 55)
19:48:28.271 INFO DAGScheduler - failed: HashSet()
19:48:28.271 INFO DAGScheduler - Submitting ResultStage 55 (MapPartitionsRDD[209] at mapToPair at BamSink.java:91), which has no missing parents
19:48:28.280 INFO MemoryStore - Block broadcast_92 stored as values in memory (estimated size 241.5 KiB, free 1915.7 MiB)
19:48:28.281 INFO MemoryStore - Block broadcast_92_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.6 MiB)
19:48:28.281 INFO BlockManagerInfo - Added broadcast_92_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.1 MiB)
19:48:28.282 INFO SparkContext - Created broadcast 92 from broadcast at DAGScheduler.scala:1580
19:48:28.282 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 55 (MapPartitionsRDD[209] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:28.282 INFO TaskSchedulerImpl - Adding task set 55.0 with 1 tasks resource profile 0
19:48:28.283 INFO TaskSetManager - Starting task 0.0 in stage 55.0 (TID 93) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:28.283 INFO Executor - Running task 0.0 in stage 55.0 (TID 93)
19:48:28.288 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:28.288 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:28.303 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:28.303 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:28.303 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:28.303 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:28.303 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:28.303 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:28.305 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:28.306 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:28.307 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
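This second test also creates .part-r-00000.sbi and .part-r-00000.bai inside the task attempt directory, so the splitting index and BAM index ride through the same task and job commit renames as the data file (see the renames at 19:48:29.137 through 19:48:29.140 below). A sketch of that side-file pattern as a custom Hadoop OutputFormat; the index written here is a toy, not the GATK/Disq implementation:

```java
import java.io.IOException;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

/** Writes a companion ".idx" side file next to each part file in the committer's
 *  work directory, so both files are promoted together by task/job commit. */
public class SideFileOutputFormat extends FileOutputFormat<NullWritable, Text> {
    @Override
    public RecordWriter<NullWritable, Text> getRecordWriter(TaskAttemptContext ctx)
            throws IOException {
        Path part  = getDefaultWorkFile(ctx, "");   // .../_temporary/.../part-r-00000
        Path index = new Path(part.getParent(), "." + part.getName() + ".idx");
        FSDataOutputStream data = part.getFileSystem(ctx.getConfiguration()).create(part, false);
        FSDataOutputStream idx  = index.getFileSystem(ctx.getConfiguration()).create(index, false);
        return new RecordWriter<NullWritable, Text>() {
            long offset = 0;
            @Override public void write(NullWritable k, Text v) throws IOException {
                idx.writeLong(offset);              // toy index: byte offset of each record
                byte[] bytes = v.copyBytes();
                data.write(bytes);
                offset += bytes.length;
            }
            @Override public void close(TaskAttemptContext c) throws IOException {
                data.close();
                idx.close();
            }
        };
    }
}
```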
19:48:28.312 INFO StateChange - BLOCK* allocate blk_1073741852_1028, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/part-r-00000
19:48:28.313 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741852_1028 src: /127.0.0.1:50618 dest: /127.0.0.1:45925
19:48:28.317 INFO clienttrace - src: /127.0.0.1:50618, dest: /127.0.0.1:45925, bytes: 229774, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741852_1028, duration(ns): 2914822
19:48:28.317 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741852_1028, type=LAST_IN_PIPELINE terminating
19:48:28.317 INFO FSNamesystem - BLOCK* blk_1073741852_1028 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/part-r-00000
19:48:28.415 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741845_1021 replica FinalizedReplica, blk_1073741845_1021, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage10689261495343833868/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741845 for deletion
19:48:28.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741845_1021 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741845
19:48:28.718 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:28.719 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:28.720 INFO StateChange - BLOCK* allocate blk_1073741853_1029, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/.part-r-00000.sbi
19:48:28.721 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741853_1029 src: /127.0.0.1:50632 dest: /127.0.0.1:45925
19:48:28.722 INFO clienttrace - src: /127.0.0.1:50632, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741853_1029, duration(ns): 477645
19:48:28.722 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741853_1029, type=LAST_IN_PIPELINE terminating
19:48:28.723 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:28.725 INFO StateChange - BLOCK* allocate blk_1073741854_1030, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/.part-r-00000.bai
19:48:28.726 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741854_1030 src: /127.0.0.1:50642 dest: /127.0.0.1:45925
19:48:28.727 INFO clienttrace - src: /127.0.0.1:50642, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741854_1030, duration(ns): 437661
19:48:28.727 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741854_1030, type=LAST_IN_PIPELINE terminating
19:48:28.727 INFO FSNamesystem - BLOCK* blk_1073741854_1030 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/.part-r-00000.bai
19:48:29.128 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:29.129 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0 dst=null perm=null proto=rpc
19:48:29.130 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0 dst=null perm=null proto=rpc
19:48:29.131 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/task_202507151948281306789262945772470_0209_r_000000 dst=null perm=null proto=rpc
19:48:29.131 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/_temporary/attempt_202507151948281306789262945772470_0209_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/task_202507151948281306789262945772470_0209_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:29.131 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948281306789262945772470_0209_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/task_202507151948281306789262945772470_0209_r_000000
19:48:29.132 INFO SparkHadoopMapRedUtil - attempt_202507151948281306789262945772470_0209_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:29.132 INFO Executor - Finished task 0.0 in stage 55.0 (TID 93). 1858 bytes result sent to driver
19:48:29.133 INFO TaskSetManager - Finished task 0.0 in stage 55.0 (TID 93) in 850 ms on localhost (executor driver) (1/1)
19:48:29.133 INFO TaskSchedulerImpl - Removed TaskSet 55.0, whose tasks have all completed, from pool
19:48:29.133 INFO DAGScheduler - ResultStage 55 (runJob at SparkHadoopWriter.scala:83) finished in 0.862 s
19:48:29.133 INFO DAGScheduler - Job 40 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:29.133 INFO TaskSchedulerImpl - Killing all running tasks in stage 55: Stage finished
19:48:29.134 INFO DAGScheduler - Job 40 finished: runJob at SparkHadoopWriter.scala:83, took 0.940009 s
19:48:29.134 INFO SparkHadoopWriter - Start to commit write Job job_202507151948281306789262945772470_0209.
19:48:29.135 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:29.135 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts dst=null perm=null proto=rpc
19:48:29.136 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/task_202507151948281306789262945772470_0209_r_000000 dst=null perm=null proto=rpc
19:48:29.137 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:29.137 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/task_202507151948281306789262945772470_0209_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.138 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:29.138 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/task_202507151948281306789262945772470_0209_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.139 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:29.140 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary/0/task_202507151948281306789262945772470_0209_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.140 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:29.141 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.142 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:29.143 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.spark-staging-209 dst=null perm=null proto=rpc
19:48:29.143 INFO SparkHadoopWriter - Write Job job_202507151948281306789262945772470_0209 committed. Elapsed time: 8 ms.
19:48:29.144 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.145 INFO StateChange - BLOCK* allocate blk_1073741855_1031, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/header
19:48:29.147 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741855_1031 src: /127.0.0.1:50656 dest: /127.0.0.1:45925
19:48:29.148 INFO clienttrace - src: /127.0.0.1:50656, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741855_1031, duration(ns): 783518
19:48:29.149 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741855_1031, type=LAST_IN_PIPELINE terminating
19:48:29.149 INFO FSNamesystem - BLOCK* blk_1073741855_1031 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/header
19:48:29.550 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:29.551 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.552 INFO StateChange - BLOCK* allocate blk_1073741856_1032, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/terminator
19:48:29.553 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741856_1032 src: /127.0.0.1:50660 dest: /127.0.0.1:45925
19:48:29.554 INFO clienttrace - src: /127.0.0.1:50660, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741856_1032, duration(ns): 428694
19:48:29.554 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741856_1032, type=LAST_IN_PIPELINE terminating
19:48:29.555 INFO FSNamesystem - BLOCK* blk_1073741856_1032 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/terminator
19:48:29.956 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
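The recurring pattern above, where the NameNode logs "COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1)" and the successful completeFile follows roughly 400 ms later, is the DFS client retrying completeFile while it waits for the DataNode's finalized-replica report to reach the NameNode. The sketch below only shows the client settings I believe govern that retry (the key names are my assumption, not taken from this log); the usual defaults already match the ~400 ms gaps seen here.

    // Assumed (not from this log): HDFS client keys that control how often and how
    // soon the client re-asks the NameNode to complete a freshly written file.
    import org.apache.hadoop.conf.Configuration;

    public class CompleteFileRetrySketch {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // Number of retries while the last block is still COMMITTED.
            conf.setInt("dfs.client.block.write.locateFollowingBlock.retries", 5);
            // Initial back-off between retries; 400 ms matches the gaps above.
            conf.setLong("dfs.client.block.write.locateFollowingBlock.initial.delay.ms", 400L);
            System.out.println(conf.get("dfs.client.block.write.locateFollowingBlock.retries"));
        }
    }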
19:48:29.957 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts dst=null perm=null proto=rpc
19:48:29.958 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.958 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:29.959 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam
19:48:29.959 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.960 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:29.960 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.960 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam done
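The concat audit entry above shows how the final BAM is assembled without rewriting data: an empty "output" file is created, the header, part-r-00000 and terminator files are spliced into it with HDFS concat (a block-level metadata operation), and the result is renamed to the .bam path. A minimal sketch of the same sequence, with shortened placeholder paths; note that HDFS requires the concat sources to live in the same directory as the target, which is why this run keeps them all under the .parts directory:

    // Sketch only: splice header + part + terminator into one HDFS file, then
    // rename it into place. Paths are placeholders for the ones in this log.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConcatSketch {
        public static void main(String[] args) throws Exception {
            Path parts  = new Path("hdfs://localhost:41235/user/runner/out.bam.parts");
            Path target = new Path(parts, "output");
            FileSystem fs = parts.getFileSystem(new Configuration());
            fs.create(target).close();                     // empty concat target, as in the audit log
            fs.concat(target, new Path[] {                 // block-level splice, no byte copying
                    new Path(parts, "header"),
                    new Path(parts, "part-r-00000"),
                    new Path(parts, "terminator") });
            fs.rename(target, new Path("hdfs://localhost:41235/user/runner/out.bam"));
        }
    }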
19:48:29.961 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:29.961 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi
19:48:29.961 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts dst=null perm=null proto=rpc
19:48:29.962 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:29.963 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:29.964 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:29.965 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:29.965 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:29.966 INFO StateChange - BLOCK* allocate blk_1073741857_1033, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi
19:48:29.967 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741857_1033 src: /127.0.0.1:50674 dest: /127.0.0.1:45925
19:48:29.968 INFO clienttrace - src: /127.0.0.1:50674, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741857_1033, duration(ns): 423019
19:48:29.968 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741857_1033, type=LAST_IN_PIPELINE terminating
19:48:29.969 INFO FSNamesystem - BLOCK* blk_1073741857_1033 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi
19:48:30.370 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:30.370 INFO IndexFileMerger - Done merging .sbi files
19:48:30.370 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai
19:48:30.371 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts dst=null perm=null proto=rpc
19:48:30.372 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:30.372 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:30.373 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:30.374 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:30.374 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:30.379 INFO StateChange - BLOCK* allocate blk_1073741858_1034, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai
19:48:30.380 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741858_1034 src: /127.0.0.1:50686 dest: /127.0.0.1:45925
19:48:30.381 INFO clienttrace - src: /127.0.0.1:50686, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741858_1034, duration(ns): 491338
19:48:30.381 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741858_1034, type=LAST_IN_PIPELINE terminating
19:48:30.382 INFO FSNamesystem - BLOCK* blk_1073741858_1034 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai
19:48:30.783 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:30.783 INFO IndexFileMerger - Done merging .bai files
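Both merges above follow the same HDFS pattern visible in the audit trail: list the .parts directory, stream each hidden per-part index into a single .sbi or .bai output, and delete the inputs. The sketch below shows only that plumbing and is deliberately naive; the real merger has to handle the index formats' headers and offsets rather than byte-concatenate, and the paths are placeholders.

    // Naive illustration of the plumbing only (the real merge is format-aware):
    // copy every hidden ".part-r-*.bai" in the temp directory into one output.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class IndexMergeSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path partsDir = new Path("hdfs://localhost:41235/user/runner/out.bam.parts");
            Path merged   = new Path("hdfs://localhost:41235/user/runner/out.bam.bai");
            FileSystem fs = partsDir.getFileSystem(conf);
            try (FSDataOutputStream out = fs.create(merged)) {
                for (FileStatus part : fs.listStatus(partsDir)) {
                    if (part.getPath().getName().endsWith(".bai")) {
                        try (FSDataInputStream in = fs.open(part.getPath())) {
                            IOUtils.copyBytes(in, out, conf, false);   // stream bytes, keep 'out' open
                        }
                        fs.delete(part.getPath(), false);              // matches the delete audit entries
                    }
                }
            }
        }
    }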
19:48:30.783 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.parts dst=null perm=null proto=rpc
19:48:30.793 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:30.805 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi dst=null perm=null proto=rpc
19:48:30.807 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi dst=null perm=null proto=rpc
19:48:30.807 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi dst=null perm=null proto=rpc
19:48:30.809 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:30.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:30.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:30.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:30.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:30.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:30.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:30.814 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:30.816 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:30.816 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:30.817 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
19:48:30.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi dst=null perm=null proto=rpc
19:48:30.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi dst=null perm=null proto=rpc
19:48:30.818 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.sbi dst=null perm=null proto=rpc
19:48:30.819 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:30.819 INFO MemoryStore - Block broadcast_93 stored as values in memory (estimated size 320.0 B, free 1915.6 MiB)
19:48:30.829 INFO MemoryStore - Block broadcast_93_piece0 stored as bytes in memory (estimated size 233.0 B, free 1915.6 MiB)
19:48:30.830 INFO BlockManagerInfo - Added broadcast_93_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.1 MiB)
19:48:30.830 INFO SparkContext - Created broadcast 93 from broadcast at BamSource.java:104
19:48:30.830 INFO BlockManagerInfo - Removed broadcast_92_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.2 MiB)
19:48:30.831 INFO BlockManagerInfo - Removed broadcast_88_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.2 MiB)
19:48:30.831 INFO BlockManagerInfo - Removed broadcast_83_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:30.832 INFO MemoryStore - Block broadcast_94 stored as values in memory (estimated size 297.9 KiB, free 1916.3 MiB)
19:48:30.832 INFO BlockManagerInfo - Removed broadcast_86_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.4 MiB)
19:48:30.833 INFO BlockManagerInfo - Removed broadcast_84_piece0 on localhost:36125 in memory (size: 153.7 KiB, free: 1919.6 MiB)
19:48:30.834 INFO BlockManagerInfo - Removed broadcast_89_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:30.834 INFO BlockManagerInfo - Removed broadcast_90_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:30.835 INFO BlockManagerInfo - Removed broadcast_91_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.8 MiB)
19:48:30.836 INFO BlockManagerInfo - Removed broadcast_77_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:30.837 INFO BlockManagerInfo - Removed broadcast_85_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1920.0 MiB)
19:48:30.841 INFO MemoryStore - Block broadcast_94_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:30.841 INFO BlockManagerInfo - Added broadcast_94_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:30.841 INFO SparkContext - Created broadcast 94 from newAPIHadoopFile at PathSplitSource.java:96
19:48:30.853 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:30.853 INFO FileInputFormat - Total input files to process : 1
19:48:30.853 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
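The broadcast created at PathSplitSource.java:96 and the FileInputFormat/open lines above come from building a NewHadoopRDD over the merged BAM so the test can read it back. The following is only a rough sketch of that kind of read path; it assumes Hadoop-BAM's BAMInputFormat and SAMRecordWritable, which may not be the classes PathSplitSource actually wraps, and the path is a placeholder.

    // Assumed classes: org.seqdoop.hadoop_bam.{BAMInputFormat, SAMRecordWritable}.
    // Sketch of reading a BAM from HDFS into a Spark pair RDD and counting records.
    import org.apache.hadoop.io.LongWritable;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.seqdoop.hadoop_bam.BAMInputFormat;
    import org.seqdoop.hadoop_bam.SAMRecordWritable;

    public class ReadBamSketch {
        public static void main(String[] args) {
            try (JavaSparkContext jsc = new JavaSparkContext(
                    new SparkConf().setMaster("local[1]").setAppName("read-bam-sketch"))) {
                JavaPairRDD<LongWritable, SAMRecordWritable> reads = jsc.newAPIHadoopFile(
                        "hdfs://localhost:41235/user/runner/out.bam",   // placeholder path
                        BAMInputFormat.class, LongWritable.class, SAMRecordWritable.class,
                        jsc.hadoopConfiguration());
                System.out.println("records: " + reads.count());
            }
        }
    }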
19:48:30.868 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:30.869 INFO DAGScheduler - Got job 41 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:30.869 INFO DAGScheduler - Final stage: ResultStage 56 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:30.869 INFO DAGScheduler - Parents of final stage: List()
19:48:30.869 INFO DAGScheduler - Missing parents: List()
19:48:30.869 INFO DAGScheduler - Submitting ResultStage 56 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:30.875 INFO MemoryStore - Block broadcast_95 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
19:48:30.876 INFO MemoryStore - Block broadcast_95_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
19:48:30.876 INFO BlockManagerInfo - Added broadcast_95_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:30.877 INFO SparkContext - Created broadcast 95 from broadcast at DAGScheduler.scala:1580
19:48:30.877 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 56 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:30.877 INFO TaskSchedulerImpl - Adding task set 56.0 with 1 tasks resource profile 0
19:48:30.878 INFO TaskSetManager - Starting task 0.0 in stage 56.0 (TID 94) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:30.878 INFO Executor - Running task 0.0 in stage 56.0 (TID 94)
19:48:30.890 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam:0+235514
19:48:30.891 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:30.892 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:30.893 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:30.893 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:30.894 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:30.895 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:30.897 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:30.900 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:30.902 INFO Executor - Finished task 0.0 in stage 56.0 (TID 94). 650184 bytes result sent to driver
19:48:30.904 INFO TaskSetManager - Finished task 0.0 in stage 56.0 (TID 94) in 27 ms on localhost (executor driver) (1/1)
19:48:30.904 INFO TaskSchedulerImpl - Removed TaskSet 56.0, whose tasks have all completed, from pool
19:48:30.905 INFO DAGScheduler - ResultStage 56 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.036 s
19:48:30.905 INFO DAGScheduler - Job 41 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:30.905 INFO TaskSchedulerImpl - Killing all running tasks in stage 56: Stage finished
19:48:30.905 INFO DAGScheduler - Job 41 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.036372 s
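Job 41's collect pulls the written reads back to the driver so the test can check them; the count jobs that follow do the same for record totals. As a hedged, non-Spark way to run the same kind of spot check against a local copy of the output (placeholder path; the test itself goes through the Spark source above and HDFS):

    // Sketch only: count the records in a locally copied BAM with HTSJDK.
    import htsjdk.samtools.SAMRecord;
    import htsjdk.samtools.SamReader;
    import htsjdk.samtools.SamReaderFactory;
    import java.io.File;

    public class CountReadsSketch {
        public static void main(String[] args) throws Exception {
            File bam = new File("/tmp/ReadsSparkSinkUnitTest2.bam");   // placeholder copy of the output
            long n = 0;
            try (SamReader reader = SamReaderFactory.makeDefault().open(bam)) {
                for (SAMRecord rec : reader) {
                    n++;
                }
            }
            System.out.println("records: " + n);
        }
    }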
19:48:30.920 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:30.921 INFO DAGScheduler - Got job 42 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:30.921 INFO DAGScheduler - Final stage: ResultStage 57 (count at ReadsSparkSinkUnitTest.java:185)
19:48:30.921 INFO DAGScheduler - Parents of final stage: List()
19:48:30.921 INFO DAGScheduler - Missing parents: List()
19:48:30.921 INFO DAGScheduler - Submitting ResultStage 57 (MapPartitionsRDD[197] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:30.939 INFO MemoryStore - Block broadcast_96 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
19:48:30.940 INFO MemoryStore - Block broadcast_96_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
19:48:30.940 INFO BlockManagerInfo - Added broadcast_96_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:30.941 INFO SparkContext - Created broadcast 96 from broadcast at DAGScheduler.scala:1580
19:48:30.941 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 57 (MapPartitionsRDD[197] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:30.941 INFO TaskSchedulerImpl - Adding task set 57.0 with 1 tasks resource profile 0
19:48:30.941 INFO TaskSetManager - Starting task 0.0 in stage 57.0 (TID 95) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
19:48:30.942 INFO Executor - Running task 0.0 in stage 57.0 (TID 95)
19:48:30.972 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:30.984 INFO Executor - Finished task 0.0 in stage 57.0 (TID 95). 989 bytes result sent to driver
19:48:30.985 INFO TaskSetManager - Finished task 0.0 in stage 57.0 (TID 95) in 44 ms on localhost (executor driver) (1/1)
19:48:30.985 INFO TaskSchedulerImpl - Removed TaskSet 57.0, whose tasks have all completed, from pool
19:48:30.985 INFO DAGScheduler - ResultStage 57 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.064 s
19:48:30.985 INFO DAGScheduler - Job 42 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:30.985 INFO TaskSchedulerImpl - Killing all running tasks in stage 57: Stage finished
19:48:30.985 INFO DAGScheduler - Job 42 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.064739 s
19:48:30.989 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:30.989 INFO DAGScheduler - Got job 43 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:30.989 INFO DAGScheduler - Final stage: ResultStage 58 (count at ReadsSparkSinkUnitTest.java:185)
19:48:30.989 INFO DAGScheduler - Parents of final stage: List()
19:48:30.989 INFO DAGScheduler - Missing parents: List()
19:48:30.989 INFO DAGScheduler - Submitting ResultStage 58 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:30.995 INFO MemoryStore - Block broadcast_97 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
19:48:30.996 INFO MemoryStore - Block broadcast_97_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
19:48:30.996 INFO BlockManagerInfo - Added broadcast_97_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:30.997 INFO SparkContext - Created broadcast 97 from broadcast at DAGScheduler.scala:1580
19:48:30.997 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 58 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:30.997 INFO TaskSchedulerImpl - Adding task set 58.0 with 1 tasks resource profile 0
19:48:30.998 INFO TaskSetManager - Starting task 0.0 in stage 58.0 (TID 96) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:30.998 INFO Executor - Running task 0.0 in stage 58.0 (TID 96)
19:48:31.010 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam:0+235514
19:48:31.011 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:31.012 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam dst=null perm=null proto=rpc
19:48:31.013 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:31.013 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:31.014 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_e0aec640-6af8-4d7a-82c9-814391110bd8.bam.bai dst=null perm=null proto=rpc
19:48:31.016 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:31.017 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:31.018 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:31.020 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
19:48:31.020 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:31.021 INFO Executor - Finished task 0.0 in stage 58.0 (TID 96). 989 bytes result sent to driver
19:48:31.022 INFO TaskSetManager - Finished task 0.0 in stage 58.0 (TID 96) in 25 ms on localhost (executor driver) (1/1)
19:48:31.022 INFO TaskSchedulerImpl - Removed TaskSet 58.0, whose tasks have all completed, from pool
19:48:31.022 INFO DAGScheduler - ResultStage 58 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.032 s
19:48:31.022 INFO DAGScheduler - Job 43 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:31.022 INFO TaskSchedulerImpl - Killing all running tasks in stage 58: Stage finished
19:48:31.022 INFO DAGScheduler - Job 43 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.033636 s
19:48:31.026 INFO MemoryStore - Block broadcast_98 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
19:48:31.032 INFO MemoryStore - Block broadcast_98_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:31.032 INFO BlockManagerInfo - Added broadcast_98_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:31.033 INFO SparkContext - Created broadcast 98 from newAPIHadoopFile at PathSplitSource.java:96
19:48:31.060 INFO MemoryStore - Block broadcast_99 stored as values in memory (estimated size 298.0 KiB, free 1917.7 MiB)
19:48:31.067 INFO MemoryStore - Block broadcast_99_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:31.067 INFO BlockManagerInfo - Added broadcast_99_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:31.067 INFO SparkContext - Created broadcast 99 from newAPIHadoopFile at PathSplitSource.java:96
19:48:31.088 INFO FileInputFormat - Total input files to process : 1
19:48:31.089 INFO MemoryStore - Block broadcast_100 stored as values in memory (estimated size 19.6 KiB, free 1917.7 MiB)
19:48:31.090 INFO MemoryStore - Block broadcast_100_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.7 MiB)
19:48:31.090 INFO BlockManagerInfo - Added broadcast_100_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.5 MiB)
19:48:31.090 INFO SparkContext - Created broadcast 100 from broadcast at ReadsSparkSink.java:133
19:48:31.091 INFO MemoryStore - Block broadcast_101 stored as values in memory (estimated size 20.0 KiB, free 1917.6 MiB)
19:48:31.091 INFO MemoryStore - Block broadcast_101_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.6 MiB)
19:48:31.092 INFO BlockManagerInfo - Added broadcast_101_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.5 MiB)
19:48:31.092 INFO SparkContext - Created broadcast 101 from broadcast at BamSink.java:76
19:48:31.094 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts dst=null perm=null proto=rpc
19:48:31.094 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:31.094 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:31.094 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:31.095 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:31.101 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:31.102 INFO DAGScheduler - Registering RDD 229 (mapToPair at SparkUtils.java:161) as input to shuffle 13
19:48:31.102 INFO DAGScheduler - Got job 44 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:31.102 INFO DAGScheduler - Final stage: ResultStage 60 (runJob at SparkHadoopWriter.scala:83)
19:48:31.102 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 59)
19:48:31.102 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 59)
19:48:31.102 INFO DAGScheduler - Submitting ShuffleMapStage 59 (MapPartitionsRDD[229] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:31.119 INFO MemoryStore - Block broadcast_102 stored as values in memory (estimated size 434.3 KiB, free 1917.2 MiB)
19:48:31.121 INFO MemoryStore - Block broadcast_102_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1917.1 MiB)
19:48:31.121 INFO BlockManagerInfo - Added broadcast_102_piece0 in memory on localhost:36125 (size: 157.6 KiB, free: 1919.4 MiB)
19:48:31.121 INFO SparkContext - Created broadcast 102 from broadcast at DAGScheduler.scala:1580
19:48:31.121 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 59 (MapPartitionsRDD[229] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:31.121 INFO TaskSchedulerImpl - Adding task set 59.0 with 1 tasks resource profile 0
19:48:31.122 INFO TaskSetManager - Starting task 0.0 in stage 59.0 (TID 97) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
19:48:31.123 INFO Executor - Running task 0.0 in stage 59.0 (TID 97)
19:48:31.154 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:31.168 INFO Executor - Finished task 0.0 in stage 59.0 (TID 97). 1148 bytes result sent to driver
19:48:31.169 INFO TaskSetManager - Finished task 0.0 in stage 59.0 (TID 97) in 47 ms on localhost (executor driver) (1/1)
19:48:31.169 INFO TaskSchedulerImpl - Removed TaskSet 59.0, whose tasks have all completed, from pool
19:48:31.169 INFO DAGScheduler - ShuffleMapStage 59 (mapToPair at SparkUtils.java:161) finished in 0.066 s
19:48:31.169 INFO DAGScheduler - looking for newly runnable stages
19:48:31.169 INFO DAGScheduler - running: HashSet()
19:48:31.169 INFO DAGScheduler - waiting: HashSet(ResultStage 60)
19:48:31.169 INFO DAGScheduler - failed: HashSet()
19:48:31.170 INFO DAGScheduler - Submitting ResultStage 60 (MapPartitionsRDD[234] at mapToPair at BamSink.java:91), which has no missing parents
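Job 44 runs as two stages because the write is preceded by a key-based shuffle: the mapToPair at SparkUtils.java:161 becomes ShuffleMapStage 59, and the write in BamSink.java:91 consumes its output in ResultStage 60. A generic illustration of how a mapToPair followed by a key sort creates that stage boundary (toy data, not GATK's code):

    // Toy example: mapToPair + sortByKey forces a shuffle, splitting the job
    // into a ShuffleMapStage and a ResultStage, just like the DAG above.
    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class ShuffleStageSketch {
        public static void main(String[] args) {
            try (JavaSparkContext jsc = new JavaSparkContext(
                    new SparkConf().setMaster("local[1]").setAppName("shuffle-sketch"))) {
                JavaPairRDD<Integer, String> byKey = jsc
                        .parallelize(Arrays.asList("chr1:100", "chr1:50", "chr2:10"))
                        .mapToPair(s -> new Tuple2<>(Integer.parseInt(s.split(":")[1]), s));
                // sortByKey introduces the shuffle boundary between the two stages.
                System.out.println(byKey.sortByKey().values().collect());
            }
        }
    }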
19:48:31.181 INFO MemoryStore - Block broadcast_103 stored as values in memory (estimated size 155.4 KiB, free 1916.9 MiB)
19:48:31.182 INFO MemoryStore - Block broadcast_103_piece0 stored as bytes in memory (estimated size 58.5 KiB, free 1916.8 MiB)
19:48:31.182 INFO BlockManagerInfo - Added broadcast_103_piece0 in memory on localhost:36125 (size: 58.5 KiB, free: 1919.3 MiB)
19:48:31.182 INFO SparkContext - Created broadcast 103 from broadcast at DAGScheduler.scala:1580
19:48:31.182 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 60 (MapPartitionsRDD[234] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:31.182 INFO TaskSchedulerImpl - Adding task set 60.0 with 1 tasks resource profile 0
19:48:31.183 INFO TaskSetManager - Starting task 0.0 in stage 60.0 (TID 98) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:31.184 INFO Executor - Running task 0.0 in stage 60.0 (TID 98)
19:48:31.190 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:31.190 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:31.212 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:31.212 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:31.212 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:31.212 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:31.212 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:31.212 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
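The committer setup is logged more than once above because a committer is constructed for each of this task's outputs; in every case no PathOutputCommitterFactory is configured, so the classic FileOutputCommitter with algorithm v1 (rename-based commit, as seen earlier) is used. If a job needed to pin that behaviour explicitly, the standard MapReduce key is the one in this sketch; setting it to 1 simply matches what the log already reports.

    // Sketch: the MapReduce setting corresponding to "Algorithm version is 1" above.
    import org.apache.hadoop.conf.Configuration;

    public class CommitterConfigSketch {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            System.out.println(conf.getInt("mapreduce.fileoutputcommitter.algorithm.version", -1));
        }
    }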
19:48:31.214 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:31.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:31.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:31.219 INFO StateChange - BLOCK* allocate blk_1073741859_1035, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/part-r-00000
19:48:31.220 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741859_1035 src: /127.0.0.1:50714 dest: /127.0.0.1:45925
19:48:31.224 INFO clienttrace - src: /127.0.0.1:50714, dest: /127.0.0.1:45925, bytes: 235299, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741859_1035, duration(ns): 3356482
19:48:31.224 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741859_1035, type=LAST_IN_PIPELINE terminating
19:48:31.225 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:31.226 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:31.226 INFO StateChange - BLOCK* allocate blk_1073741860_1036, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/.part-r-00000.sbi
19:48:31.227 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741860_1036 src: /127.0.0.1:50718 dest: /127.0.0.1:45925
19:48:31.228 INFO clienttrace - src: /127.0.0.1:50718, dest: /127.0.0.1:45925, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741860_1036, duration(ns): 350905
19:48:31.228 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741860_1036, type=LAST_IN_PIPELINE terminating
19:48:31.229 INFO FSNamesystem - BLOCK* blk_1073741860_1036 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/.part-r-00000.sbi
19:48:31.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741853_1029 replica FinalizedReplica, blk_1073741853_1029, FINALIZED (numBytes=212, bytesOnDisk=212, visibleLength=212, volume=/tmp/minicluster_storage10689261495343833868/data/data1, blockURI=file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741853) for deletion
19:48:31.415 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741854_1030 replica FinalizedReplica, blk_1073741854_1030, FINALIZED (numBytes=5472, bytesOnDisk=5472, visibleLength=5472, volume=/tmp/minicluster_storage10689261495343833868/data/data2, blockURI=file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741854) for deletion
19:48:31.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741853_1029 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741853
19:48:31.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741854_1030 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741854
19:48:31.629 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:31.631 INFO StateChange - BLOCK* allocate blk_1073741861_1037, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/.part-r-00000.bai
19:48:31.631 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741861_1037 src: /127.0.0.1:50726 dest: /127.0.0.1:45925
19:48:31.633 INFO clienttrace - src: /127.0.0.1:50726, dest: /127.0.0.1:45925, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741861_1037, duration(ns): 432913
19:48:31.633 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741861_1037, type=LAST_IN_PIPELINE terminating
19:48:31.633 INFO FSNamesystem - BLOCK* blk_1073741861_1037 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/.part-r-00000.bai
19:48:32.034 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:32.035 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0 dst=null perm=null proto=rpc
19:48:32.036 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0 dst=null perm=null proto=rpc
19:48:32.036 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/task_202507151948318041623345337666110_0234_r_000000 dst=null perm=null proto=rpc
19:48:32.037 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/_temporary/attempt_202507151948318041623345337666110_0234_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/task_202507151948318041623345337666110_0234_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:32.037 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948318041623345337666110_0234_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/task_202507151948318041623345337666110_0234_r_000000
19:48:32.037 INFO SparkHadoopMapRedUtil - attempt_202507151948318041623345337666110_0234_r_000000_0: Committed. Elapsed time: 2 ms.
19:48:32.038 INFO Executor - Finished task 0.0 in stage 60.0 (TID 98). 1858 bytes result sent to driver
19:48:32.039 INFO TaskSetManager - Finished task 0.0 in stage 60.0 (TID 98) in 856 ms on localhost (executor driver) (1/1)
19:48:32.039 INFO TaskSchedulerImpl - Removed TaskSet 60.0, whose tasks have all completed, from pool
19:48:32.039 INFO DAGScheduler - ResultStage 60 (runJob at SparkHadoopWriter.scala:83) finished in 0.869 s
19:48:32.039 INFO DAGScheduler - Job 44 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:32.039 INFO TaskSchedulerImpl - Killing all running tasks in stage 60: Stage finished
19:48:32.040 INFO DAGScheduler - Job 44 finished: runJob at SparkHadoopWriter.scala:83, took 0.938263 s
19:48:32.040 INFO SparkHadoopWriter - Start to commit write Job job_202507151948318041623345337666110_0234.
19:48:32.041 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:32.042 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts dst=null perm=null proto=rpc
19:48:32.042 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/task_202507151948318041623345337666110_0234_r_000000 dst=null perm=null proto=rpc
19:48:32.043 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:32.043 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/task_202507151948318041623345337666110_0234_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.044 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:32.044 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/task_202507151948318041623345337666110_0234_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.045 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:32.045 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary/0/task_202507151948318041623345337666110_0234_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.046 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:32.047 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.047 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:32.048 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.spark-staging-234 dst=null perm=null proto=rpc
19:48:32.049 INFO SparkHadoopWriter - Write Job job_202507151948318041623345337666110_0234 committed. Elapsed time: 8 ms.
19:48:32.049 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.050 INFO StateChange - BLOCK* allocate blk_1073741862_1038, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/header
19:48:32.051 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741862_1038 src: /127.0.0.1:50738 dest: /127.0.0.1:45925
19:48:32.053 INFO clienttrace - src: /127.0.0.1:50738, dest: /127.0.0.1:45925, bytes: 1190, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741862_1038, duration(ns): 493354
19:48:32.053 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741862_1038, type=LAST_IN_PIPELINE terminating
19:48:32.054 INFO FSNamesystem - BLOCK* blk_1073741862_1038 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/header
19:48:32.455 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:32.456 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.457 INFO StateChange - BLOCK* allocate blk_1073741863_1039, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/terminator
19:48:32.458 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741863_1039 src: /127.0.0.1:50754 dest: /127.0.0.1:45925
19:48:32.459 INFO clienttrace - src: /127.0.0.1:50754, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741863_1039, duration(ns): 373199
19:48:32.459 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741863_1039, type=LAST_IN_PIPELINE terminating
19:48:32.459 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:32.460 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts dst=null perm=null proto=rpc
19:48:32.461 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.461 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:32.462 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam
19:48:32.462 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.463 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam done
19:48:32.464 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.464 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi
19:48:32.464 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts dst=null perm=null proto=rpc
19:48:32.465 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:32.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:32.467 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
19:48:32.468 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:32.469 INFO StateChange - BLOCK* allocate blk_1073741864_1040, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi
19:48:32.469 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741864_1040 src: /127.0.0.1:50770 dest: /127.0.0.1:45925
19:48:32.470 INFO clienttrace - src: /127.0.0.1:50770, dest: /127.0.0.1:45925, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741864_1040, duration(ns): 352527
19:48:32.470 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741864_1040, type=LAST_IN_PIPELINE terminating
19:48:32.471 INFO FSNamesystem - BLOCK* blk_1073741864_1040 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi
19:48:32.872 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:32.872 INFO IndexFileMerger - Done merging .sbi files
19:48:32.872 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai
19:48:32.873 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts dst=null perm=null proto=rpc
19:48:32.873 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:32.874 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:32.875 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:32.876 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:32.876 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:32.877 INFO StateChange - BLOCK* allocate blk_1073741865_1041, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai
19:48:32.878 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741865_1041 src: /127.0.0.1:50772 dest: /127.0.0.1:45925
19:48:32.879 INFO clienttrace - src: /127.0.0.1:50772, dest: /127.0.0.1:45925, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741865_1041, duration(ns): 449372
19:48:32.879 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741865_1041, type=LAST_IN_PIPELINE terminating
19:48:32.880 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:32.881 INFO IndexFileMerger - Done merging .bai files
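The .sbi and .bai merges above follow the same pattern: list the temp ".parts" directory, stream the per-part index (here a single ".part-r-00000.sbi" / ".part-r-00000.bai") into a freshly created output file, and delete the part afterwards, which is what the listStatus/open/delete/create audit entries reflect. A rough sketch of that copy loop with stock Hadoop I/O utilities, simplified in that a real index merge rewrites headers and offsets rather than just concatenating bytes, and not the actual IndexFileMerger code:

    import org.apache.hadoop.fs.*;
    import org.apache.hadoop.io.IOUtils;

    public class IndexMergeSketch {
        // Copy every per-part index file with the given extension from partsDir into one output file.
        static void mergeIndexParts(FileSystem fs, Path partsDir, Path output, String ext)
                throws java.io.IOException {
            try (FSDataOutputStream out = fs.create(output)) {
                for (FileStatus status : fs.listStatus(partsDir)) {
                    Path p = status.getPath();
                    if (!p.getName().endsWith(ext)) {
                        continue; // skip header/terminator/data parts
                    }
                    try (FSDataInputStream in = fs.open(p)) {
                        IOUtils.copyBytes(in, out, 4096, false); // stream the part into the merged index
                    }
                    fs.delete(p, false); // matches the delete of .part-r-00000.sbi / .bai above
                }
            }
        }
    }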
19:48:32.881 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.parts dst=null perm=null proto=rpc
19:48:32.890 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:32.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi dst=null perm=null proto=rpc
19:48:32.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi dst=null perm=null proto=rpc
19:48:32.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi dst=null perm=null proto=rpc
19:48:32.900 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
19:48:32.900 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.901 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.901 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.902 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.903 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:32.903 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:32.904 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:32.905 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
19:48:32.906 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:32.907 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:32.907 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi dst=null perm=null proto=rpc
19:48:32.908 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi dst=null perm=null proto=rpc
19:48:32.908 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.sbi dst=null perm=null proto=rpc
19:48:32.909 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
19:48:32.910 INFO MemoryStore - Block broadcast_104 stored as values in memory (estimated size 312.0 B, free 1916.8 MiB)
19:48:32.910 INFO MemoryStore - Block broadcast_104_piece0 stored as bytes in memory (estimated size 231.0 B, free 1916.8 MiB)
19:48:32.910 INFO BlockManagerInfo - Added broadcast_104_piece0 in memory on localhost:36125 (size: 231.0 B, free: 1919.3 MiB)
19:48:32.911 INFO SparkContext - Created broadcast 104 from broadcast at BamSource.java:104
19:48:32.912 INFO MemoryStore - Block broadcast_105 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
19:48:32.923 INFO MemoryStore - Block broadcast_105_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.5 MiB)
19:48:32.923 INFO BlockManagerInfo - Added broadcast_105_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:32.923 INFO SparkContext - Created broadcast 105 from newAPIHadoopFile at PathSplitSource.java:96
19:48:32.933 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.934 INFO FileInputFormat - Total input files to process : 1
19:48:32.934 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.949 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:32.949 INFO DAGScheduler - Got job 45 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:32.949 INFO DAGScheduler - Final stage: ResultStage 61 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:32.949 INFO DAGScheduler - Parents of final stage: List()
19:48:32.949 INFO DAGScheduler - Missing parents: List()
19:48:32.949 INFO DAGScheduler - Submitting ResultStage 61 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:32.955 INFO MemoryStore - Block broadcast_106 stored as values in memory (estimated size 148.2 KiB, free 1916.4 MiB)
19:48:32.956 INFO MemoryStore - Block broadcast_106_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.3 MiB)
19:48:32.956 INFO BlockManagerInfo - Added broadcast_106_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.2 MiB)
19:48:32.956 INFO SparkContext - Created broadcast 106 from broadcast at DAGScheduler.scala:1580
19:48:32.957 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 61 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:32.957 INFO TaskSchedulerImpl - Adding task set 61.0 with 1 tasks resource profile 0
19:48:32.957 INFO TaskSetManager - Starting task 0.0 in stage 61.0 (TID 99) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:32.958 INFO Executor - Running task 0.0 in stage 61.0 (TID 99)
19:48:32.969 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam:0+236517
19:48:32.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:32.972 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:32.972 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:32.973 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:32.974 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
19:48:32.976 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:32.976 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:32.978 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
19:48:32.978 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:32.981 INFO Executor - Finished task 0.0 in stage 61.0 (TID 99). 749513 bytes result sent to driver
19:48:32.984 INFO TaskSetManager - Finished task 0.0 in stage 61.0 (TID 99) in 27 ms on localhost (executor driver) (1/1)
19:48:32.984 INFO TaskSchedulerImpl - Removed TaskSet 61.0, whose tasks have all completed, from pool
19:48:32.984 INFO DAGScheduler - ResultStage 61 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.034 s
19:48:32.984 INFO DAGScheduler - Job 45 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:32.984 INFO TaskSchedulerImpl - Killing all running tasks in stage 61: Stage finished
19:48:32.984 INFO DAGScheduler - Job 45 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.035679 s
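Jobs 45 through 47 are the verification phase of the test: the freshly written hdfs://…bam is read back through newAPIHadoopFile (PathSplitSource.java:96 above) and collected or counted on the driver. The shape of that round trip, sketched with a generic Hadoop input format; the real source uses a BAM-aware input format plus a read filter, so treat this only as an outline of the Spark calls involved:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ReadBackSketch {
        static long readBackAndCount(JavaSparkContext sc, String path) {
            // newAPIHadoopFile builds an RDD over the file's input splits; the log shows a
            // single 236517-byte split for the merged BAM.
            JavaRDD<Text> records = sc
                .newAPIHadoopFile(path, TextInputFormat.class, LongWritable.class, Text.class,
                                  new Configuration())
                .values();
            // collect() at ReadsSparkSinkUnitTest.java:182 pulls the records to the driver for
            // comparison with the original reads; count() at line 185 drives jobs 46 and 47.
            return records.count();
        }
    }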
19:48:32.999 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:32.999 INFO DAGScheduler - Got job 46 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:32.999 INFO DAGScheduler - Final stage: ResultStage 62 (count at ReadsSparkSinkUnitTest.java:185)
19:48:32.999 INFO DAGScheduler - Parents of final stage: List()
19:48:32.999 INFO DAGScheduler - Missing parents: List()
19:48:32.999 INFO DAGScheduler - Submitting ResultStage 62 (MapPartitionsRDD[222] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:33.017 INFO MemoryStore - Block broadcast_107 stored as values in memory (estimated size 426.1 KiB, free 1915.9 MiB)
19:48:33.018 INFO MemoryStore - Block broadcast_107_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
19:48:33.018 INFO BlockManagerInfo - Added broadcast_107_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.1 MiB)
19:48:33.018 INFO SparkContext - Created broadcast 107 from broadcast at DAGScheduler.scala:1580
19:48:33.019 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 62 (MapPartitionsRDD[222] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:33.019 INFO TaskSchedulerImpl - Adding task set 62.0 with 1 tasks resource profile 0
19:48:33.019 INFO TaskSetManager - Starting task 0.0 in stage 62.0 (TID 100) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
19:48:33.020 INFO Executor - Running task 0.0 in stage 62.0 (TID 100)
19:48:33.062 INFO BlockManagerInfo - Removed broadcast_97_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.1 MiB)
19:48:33.063 INFO BlockManagerInfo - Removed broadcast_94_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:33.064 INFO BlockManagerInfo - Removed broadcast_106_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.2 MiB)
19:48:33.064 INFO BlockManagerInfo - Removed broadcast_96_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.4 MiB)
19:48:33.065 INFO BlockManagerInfo - Removed broadcast_100_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.4 MiB)
19:48:33.065 INFO BlockManagerInfo - Removed broadcast_101_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.4 MiB)
19:48:33.066 INFO BlockManagerInfo - Removed broadcast_95_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.4 MiB)
19:48:33.067 INFO BlockManagerInfo - Removed broadcast_87_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.5 MiB)
19:48:33.067 INFO BlockManagerInfo - Removed broadcast_99_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:33.068 INFO BlockManagerInfo - Removed broadcast_102_piece0 on localhost:36125 in memory (size: 157.6 KiB, free: 1919.7 MiB)
19:48:33.069 INFO BlockManagerInfo - Removed broadcast_93_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.7 MiB)
19:48:33.070 INFO BlockManagerInfo - Removed broadcast_103_piece0 on localhost:36125 in memory (size: 58.5 KiB, free: 1919.8 MiB)
19:48:33.073 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:33.083 INFO Executor - Finished task 0.0 in stage 62.0 (TID 100). 1032 bytes result sent to driver
19:48:33.083 INFO TaskSetManager - Finished task 0.0 in stage 62.0 (TID 100) in 64 ms on localhost (executor driver) (1/1)
19:48:33.084 INFO TaskSchedulerImpl - Removed TaskSet 62.0, whose tasks have all completed, from pool
19:48:33.084 INFO DAGScheduler - ResultStage 62 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.084 s
19:48:33.084 INFO DAGScheduler - Job 46 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:33.084 INFO TaskSchedulerImpl - Killing all running tasks in stage 62: Stage finished
19:48:33.084 INFO DAGScheduler - Job 46 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.085231 s
19:48:33.088 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:33.088 INFO DAGScheduler - Got job 47 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:33.088 INFO DAGScheduler - Final stage: ResultStage 63 (count at ReadsSparkSinkUnitTest.java:185)
19:48:33.088 INFO DAGScheduler - Parents of final stage: List()
19:48:33.088 INFO DAGScheduler - Missing parents: List()
19:48:33.088 INFO DAGScheduler - Submitting ResultStage 63 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:33.095 INFO MemoryStore - Block broadcast_108 stored as values in memory (estimated size 148.1 KiB, free 1918.6 MiB)
19:48:33.095 INFO MemoryStore - Block broadcast_108_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.6 MiB)
19:48:33.096 INFO BlockManagerInfo - Added broadcast_108_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.7 MiB)
19:48:33.096 INFO SparkContext - Created broadcast 108 from broadcast at DAGScheduler.scala:1580
19:48:33.096 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 63 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:33.096 INFO TaskSchedulerImpl - Adding task set 63.0 with 1 tasks resource profile 0
19:48:33.097 INFO TaskSetManager - Starting task 0.0 in stage 63.0 (TID 101) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:33.097 INFO Executor - Running task 0.0 in stage 63.0 (TID 101)
19:48:33.109 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam:0+236517
19:48:33.110 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:33.111 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam dst=null perm=null proto=rpc
19:48:33.112 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:33.112 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:33.112 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_8b3e77d6-41e9-4ae3-ba5e-dd06af456981.bam.bai dst=null perm=null proto=rpc
19:48:33.114 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
19:48:33.115 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:33.116 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:33.117 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:33.118 INFO Executor - Finished task 0.0 in stage 63.0 (TID 101). 989 bytes result sent to driver
19:48:33.119 INFO TaskSetManager - Finished task 0.0 in stage 63.0 (TID 101) in 22 ms on localhost (executor driver) (1/1)
19:48:33.119 INFO TaskSchedulerImpl - Removed TaskSet 63.0, whose tasks have all completed, from pool
19:48:33.119 INFO DAGScheduler - ResultStage 63 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.030 s
19:48:33.119 INFO DAGScheduler - Job 47 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:33.119 INFO TaskSchedulerImpl - Killing all running tasks in stage 63: Stage finished
19:48:33.119 INFO DAGScheduler - Job 47 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.031672 s
19:48:33.124 INFO MemoryStore - Block broadcast_109 stored as values in memory (estimated size 576.0 B, free 1918.6 MiB)
19:48:33.127 INFO MemoryStore - Block broadcast_109_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.6 MiB)
19:48:33.127 INFO BlockManagerInfo - Added broadcast_109_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.7 MiB)
19:48:33.127 INFO SparkContext - Created broadcast 109 from broadcast at CramSource.java:114
19:48:33.128 INFO MemoryStore - Block broadcast_110 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
19:48:33.135 INFO MemoryStore - Block broadcast_110_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
19:48:33.135 INFO BlockManagerInfo - Added broadcast_110_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:33.135 INFO SparkContext - Created broadcast 110 from newAPIHadoopFile at PathSplitSource.java:96
19:48:33.152 INFO MemoryStore - Block broadcast_111 stored as values in memory (estimated size 576.0 B, free 1918.2 MiB)
19:48:33.152 INFO MemoryStore - Block broadcast_111_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.2 MiB)
19:48:33.153 INFO BlockManagerInfo - Added broadcast_111_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.6 MiB)
19:48:33.153 INFO SparkContext - Created broadcast 111 from broadcast at CramSource.java:114
19:48:33.154 INFO MemoryStore - Block broadcast_112 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
19:48:33.160 INFO MemoryStore - Block broadcast_112_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
19:48:33.160 INFO BlockManagerInfo - Added broadcast_112_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:33.160 INFO SparkContext - Created broadcast 112 from newAPIHadoopFile at PathSplitSource.java:96
19:48:33.175 INFO FileInputFormat - Total input files to process : 1
19:48:33.176 INFO MemoryStore - Block broadcast_113 stored as values in memory (estimated size 6.0 KiB, free 1917.9 MiB)
19:48:33.177 INFO MemoryStore - Block broadcast_113_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.9 MiB)
19:48:33.177 INFO BlockManagerInfo - Added broadcast_113_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.6 MiB)
19:48:33.177 INFO SparkContext - Created broadcast 113 from broadcast at ReadsSparkSink.java:133
19:48:33.178 INFO MemoryStore - Block broadcast_114 stored as values in memory (estimated size 6.2 KiB, free 1917.9 MiB)
19:48:33.178 INFO MemoryStore - Block broadcast_114_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.9 MiB)
19:48:33.179 INFO BlockManagerInfo - Added broadcast_114_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.6 MiB)
19:48:33.179 INFO SparkContext - Created broadcast 114 from broadcast at CramSink.java:76
19:48:33.183 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts dst=null perm=null proto=rpc
19:48:33.184 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:33.184 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:33.184 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:33.185 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
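The writer falls back to the classic FileOutputCommitter with algorithm version 1, which stages output under "_temporary/0" as the mkdirs entry shows. The behaviour is controlled through the job configuration; the sketch below uses the property names exposed by Hadoop's FileOutputCommitter (worth double-checking against the Hadoop version in use), with values matching the flags logged above:

    import org.apache.hadoop.conf.Configuration;

    public class CommitterConfigSketch {
        static Configuration committerConf() {
            Configuration conf = new Configuration();
            // Algorithm 1 commits tasks by renaming attempt dirs under _temporary and then
            // renames them into place at job commit; algorithm 2 renames at task commit.
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            // "skip cleanup _temporary folders ... : false, ignore cleanup failures: false"
            conf.setBoolean("mapreduce.fileoutputcommitter.cleanup.skipped", false);
            conf.setBoolean("mapreduce.fileoutputcommitter.cleanup-failures.ignored", false);
            return conf;
        }
    }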
19:48:33.191 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:33.192 INFO DAGScheduler - Registering RDD 252 (mapToPair at SparkUtils.java:161) as input to shuffle 14
19:48:33.192 INFO DAGScheduler - Got job 48 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:33.192 INFO DAGScheduler - Final stage: ResultStage 65 (runJob at SparkHadoopWriter.scala:83)
19:48:33.192 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 64)
19:48:33.192 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 64)
19:48:33.192 INFO DAGScheduler - Submitting ShuffleMapStage 64 (MapPartitionsRDD[252] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:33.213 INFO MemoryStore - Block broadcast_115 stored as values in memory (estimated size 292.8 KiB, free 1917.6 MiB)
19:48:33.214 INFO MemoryStore - Block broadcast_115_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.5 MiB)
19:48:33.214 INFO BlockManagerInfo - Added broadcast_115_piece0 in memory on localhost:36125 (size: 107.3 KiB, free: 1919.5 MiB)
19:48:33.214 INFO SparkContext - Created broadcast 115 from broadcast at DAGScheduler.scala:1580
19:48:33.214 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 64 (MapPartitionsRDD[252] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:33.214 INFO TaskSchedulerImpl - Adding task set 64.0 with 1 tasks resource profile 0
19:48:33.215 INFO TaskSetManager - Starting task 0.0 in stage 64.0 (TID 102) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
19:48:33.216 INFO Executor - Running task 0.0 in stage 64.0 (TID 102)
19:48:33.238 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:33.263 INFO Executor - Finished task 0.0 in stage 64.0 (TID 102). 1148 bytes result sent to driver
19:48:33.264 INFO TaskSetManager - Finished task 0.0 in stage 64.0 (TID 102) in 49 ms on localhost (executor driver) (1/1)
19:48:33.264 INFO TaskSchedulerImpl - Removed TaskSet 64.0, whose tasks have all completed, from pool
19:48:33.264 INFO DAGScheduler - ShuffleMapStage 64 (mapToPair at SparkUtils.java:161) finished in 0.071 s
19:48:33.264 INFO DAGScheduler - looking for newly runnable stages
19:48:33.264 INFO DAGScheduler - running: HashSet()
19:48:33.264 INFO DAGScheduler - waiting: HashSet(ResultStage 65)
19:48:33.264 INFO DAGScheduler - failed: HashSet()
19:48:33.264 INFO DAGScheduler - Submitting ResultStage 65 (MapPartitionsRDD[257] at mapToPair at CramSink.java:89), which has no missing parents
19:48:33.273 INFO MemoryStore - Block broadcast_116 stored as values in memory (estimated size 153.3 KiB, free 1917.3 MiB)
19:48:33.273 INFO MemoryStore - Block broadcast_116_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1917.3 MiB)
19:48:33.274 INFO BlockManagerInfo - Added broadcast_116_piece0 in memory on localhost:36125 (size: 58.1 KiB, free: 1919.4 MiB)
19:48:33.274 INFO SparkContext - Created broadcast 116 from broadcast at DAGScheduler.scala:1580
19:48:33.274 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 65 (MapPartitionsRDD[257] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
19:48:33.274 INFO TaskSchedulerImpl - Adding task set 65.0 with 1 tasks resource profile 0
19:48:33.275 INFO TaskSetManager - Starting task 0.0 in stage 65.0 (TID 103) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:33.277 INFO Executor - Running task 0.0 in stage 65.0 (TID 103)
19:48:33.284 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:33.284 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:33.292 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:33.292 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:33.292 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:33.292 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:33.292 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:33.292 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:33.295 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/_temporary/attempt_20250715194833342195473839734029_0257_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:33.407 INFO StateChange - BLOCK* allocate blk_1073741866_1042, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/_temporary/attempt_20250715194833342195473839734029_0257_r_000000_0/part-r-00000
19:48:33.408 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741866_1042 src: /127.0.0.1:50780 dest: /127.0.0.1:45925
19:48:33.409 INFO clienttrace - src: /127.0.0.1:50780, dest: /127.0.0.1:45925, bytes: 42659, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741866_1042, duration(ns): 590327
19:48:33.410 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741866_1042, type=LAST_IN_PIPELINE terminating
19:48:33.410 INFO FSNamesystem - BLOCK* blk_1073741866_1042 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/_temporary/attempt_20250715194833342195473839734029_0257_r_000000_0/part-r-00000
19:48:33.811 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/_temporary/attempt_20250715194833342195473839734029_0257_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:33.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/_temporary/attempt_20250715194833342195473839734029_0257_r_000000_0 dst=null perm=null proto=rpc
19:48:33.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/_temporary/attempt_20250715194833342195473839734029_0257_r_000000_0 dst=null perm=null proto=rpc
19:48:33.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/task_20250715194833342195473839734029_0257_r_000000 dst=null perm=null proto=rpc
19:48:33.814 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/_temporary/attempt_20250715194833342195473839734029_0257_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/task_20250715194833342195473839734029_0257_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:33.814 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194833342195473839734029_0257_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/task_20250715194833342195473839734029_0257_r_000000
19:48:33.814 INFO SparkHadoopMapRedUtil - attempt_20250715194833342195473839734029_0257_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:33.815 INFO Executor - Finished task 0.0 in stage 65.0 (TID 103). 1858 bytes result sent to driver
19:48:33.816 INFO TaskSetManager - Finished task 0.0 in stage 65.0 (TID 103) in 541 ms on localhost (executor driver) (1/1)
19:48:33.816 INFO TaskSchedulerImpl - Removed TaskSet 65.0, whose tasks have all completed, from pool
19:48:33.816 INFO DAGScheduler - ResultStage 65 (runJob at SparkHadoopWriter.scala:83) finished in 0.551 s
19:48:33.816 INFO DAGScheduler - Job 48 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:33.816 INFO TaskSchedulerImpl - Killing all running tasks in stage 65: Stage finished
19:48:33.816 INFO DAGScheduler - Job 48 finished: runJob at SparkHadoopWriter.scala:83, took 0.625183 s
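Job 48 is the CRAM write itself: stage 64 is the shuffle map side (mapToPair at SparkUtils.java:161, i.e. keying the reads for sorting) and stage 65 formats each partition and writes it through SparkHadoopWriter into the ".parts" directory. The overall Spark pattern, reduced to a generic pair-RDD sketch; the real sink keys by alignment position and uses a CRAM output format, neither of which is shown here:

    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import scala.Tuple2;

    public class SortAndWriteSketch {
        static void sortAndWrite(JavaRDD<String> reads, String partsDir) {
            // Shuffle map stage: key each record so the shuffle can sort it (stage 64 above).
            JavaPairRDD<String, String> keyed = reads.mapToPair(r -> new Tuple2<>(r, ""));
            // Result stage: sort across the shuffle, convert to Writables, and write one part
            // file per partition under <partsDir>/_temporary via the output committer (stage 65).
            keyed.sortByKey()
                 .mapToPair(t -> new Tuple2<>(new Text(t._1()), NullWritable.get()))
                 .saveAsNewAPIHadoopFile(partsDir, Text.class, NullWritable.class, TextOutputFormat.class);
        }
    }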
19:48:33.817 INFO SparkHadoopWriter - Start to commit write Job job_20250715194833342195473839734029_0257.
19:48:33.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:33.818 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts dst=null perm=null proto=rpc
19:48:33.818 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/task_20250715194833342195473839734029_0257_r_000000 dst=null perm=null proto=rpc
19:48:33.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:33.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary/0/task_20250715194833342195473839734029_0257_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:33.820 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_temporary dst=null perm=null proto=rpc
19:48:33.820 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:33.821 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:33.822 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/.spark-staging-257 dst=null perm=null proto=rpc
19:48:33.822 INFO SparkHadoopWriter - Write Job job_20250715194833342195473839734029_0257 committed. Elapsed time: 5 ms.
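The commit sequence in the audit log is the standard algorithm-1 job commit: list "_temporary/0", rename each committed task directory's part file to the top of the output directory, delete "_temporary", and create the "_SUCCESS" marker (the allowed=false delete of ".spark-staging-257" is Spark's own staging cleanup and is harmless here). In FileSystem terms, roughly as follows; paths are illustrative and this is not the committer's actual code:

    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class JobCommitSketch {
        static void commitJobV1(FileSystem fs, Path outputDir) throws java.io.IOException {
            Path temporary = new Path(outputDir, "_temporary/0");
            // Promote every committed task's files into the output directory.
            for (FileStatus task : fs.listStatus(temporary)) {
                for (FileStatus file : fs.listStatus(task.getPath())) {
                    fs.rename(file.getPath(), new Path(outputDir, file.getPath().getName()));
                }
            }
            fs.delete(new Path(outputDir, "_temporary"), true); // recursive cleanup
            fs.create(new Path(outputDir, "_SUCCESS")).close();  // job success marker
        }
    }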
19:48:33.822 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:33.825 INFO StateChange - BLOCK* allocate blk_1073741867_1043, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/header
19:48:33.826 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741867_1043 src: /127.0.0.1:50790 dest: /127.0.0.1:45925
19:48:33.827 INFO clienttrace - src: /127.0.0.1:50790, dest: /127.0.0.1:45925, bytes: 1016, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741867_1043, duration(ns): 427181
19:48:33.827 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741867_1043, type=LAST_IN_PIPELINE terminating
19:48:33.828 INFO FSNamesystem - BLOCK* blk_1073741867_1043 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/header
19:48:34.228 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:34.230 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:34.230 INFO StateChange - BLOCK* allocate blk_1073741868_1044, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/terminator
19:48:34.231 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741868_1044 src: /127.0.0.1:50796 dest: /127.0.0.1:45925
19:48:34.232 INFO clienttrace - src: /127.0.0.1:50796, dest: /127.0.0.1:45925, bytes: 38, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741868_1044, duration(ns): 397665
19:48:34.232 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741868_1044, type=LAST_IN_PIPELINE terminating
19:48:34.233 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:34.233 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts dst=null perm=null proto=rpc
19:48:34.235 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:34.236 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:34.236 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram
19:48:34.236 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/header, /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:34.237 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.237 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts/output dst=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:34.237 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram done
19:48:34.238 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.parts dst=null perm=null proto=rpc
19:48:34.238 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.239 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.239 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.240 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.241 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.crai dst=null perm=null proto=rpc
19:48:34.241 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.crai dst=null perm=null proto=rpc
19:48:34.244 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
19:48:34.245 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.246 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.246 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.crai dst=null perm=null proto=rpc
19:48:34.246 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.crai dst=null perm=null proto=rpc
19:48:34.247 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.247 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.248 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
19:48:34.249 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
19:48:34.249 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
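When the CRAM is opened for the read-back, the client probes two index locations before reading: "<name>.cram.crai" and "<name>.crai" (both getfileinfo calls above find nothing, so the file is scanned without an index). A small sketch of that candidate lookup; the helper name is made up for illustration:

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CramIndexLookupSketch {
        // Hypothetical helper: return the first existing .crai candidate for a CRAM path, or null.
        static Path findCraiIndex(FileSystem fs, Path cram) throws java.io.IOException {
            Path sameName = cram.suffix(".crai"); // e.g. sample.cram.crai
            if (fs.exists(sameName)) {
                return sameName;
            }
            String base = cram.getName().replaceAll("\\.cram$", ".crai"); // e.g. sample.crai
            Path swapped = new Path(cram.getParent(), base);
            return fs.exists(swapped) ? swapped : null;
        }
    }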
19:48:34.250 INFO MemoryStore - Block broadcast_117 stored as values in memory (estimated size 528.0 B, free 1917.3 MiB)
19:48:34.250 INFO MemoryStore - Block broadcast_117_piece0 stored as bytes in memory (estimated size 187.0 B, free 1917.3 MiB)
19:48:34.251 INFO BlockManagerInfo - Added broadcast_117_piece0 in memory on localhost:36125 (size: 187.0 B, free: 1919.4 MiB)
19:48:34.251 INFO SparkContext - Created broadcast 117 from broadcast at CramSource.java:114
19:48:34.252 INFO MemoryStore - Block broadcast_118 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
19:48:34.259 INFO MemoryStore - Block broadcast_118_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
19:48:34.259 INFO BlockManagerInfo - Added broadcast_118_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:34.259 INFO SparkContext - Created broadcast 118 from newAPIHadoopFile at PathSplitSource.java:96
19:48:34.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.275 INFO FileInputFormat - Total input files to process : 1
19:48:34.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.301 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:34.302 INFO DAGScheduler - Got job 49 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:34.302 INFO DAGScheduler - Final stage: ResultStage 66 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:34.302 INFO DAGScheduler - Parents of final stage: List()
19:48:34.302 INFO DAGScheduler - Missing parents: List()
19:48:34.302 INFO DAGScheduler - Submitting ResultStage 66 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:34.313 INFO MemoryStore - Block broadcast_119 stored as values in memory (estimated size 286.8 KiB, free 1916.6 MiB)
19:48:34.314 INFO MemoryStore - Block broadcast_119_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1916.5 MiB)
19:48:34.315 INFO BlockManagerInfo - Added broadcast_119_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.3 MiB)
19:48:34.315 INFO SparkContext - Created broadcast 119 from broadcast at DAGScheduler.scala:1580
19:48:34.315 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 66 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:34.315 INFO TaskSchedulerImpl - Adding task set 66.0 with 1 tasks resource profile 0
19:48:34.316 INFO TaskSetManager - Starting task 0.0 in stage 66.0 (TID 104) (localhost, executor driver, partition 0, ANY, 7853 bytes)
19:48:34.316 INFO Executor - Running task 0.0 in stage 66.0 (TID 104)
19:48:34.337 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram:0+43713
19:48:34.337 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.338 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.339 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.crai dst=null perm=null proto=rpc
19:48:34.339 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.crai dst=null perm=null proto=rpc
19:48:34.341 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
19:48:34.342 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
19:48:34.342 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
19:48:34.411 INFO Executor - Finished task 0.0 in stage 66.0 (TID 104). 154101 bytes result sent to driver
19:48:34.412 INFO TaskSetManager - Finished task 0.0 in stage 66.0 (TID 104) in 97 ms on localhost (executor driver) (1/1)
19:48:34.412 INFO TaskSchedulerImpl - Removed TaskSet 66.0, whose tasks have all completed, from pool
19:48:34.412 INFO DAGScheduler - ResultStage 66 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.110 s
19:48:34.412 INFO DAGScheduler - Job 49 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:34.412 INFO TaskSchedulerImpl - Killing all running tasks in stage 66: Stage finished
19:48:34.412 INFO DAGScheduler - Job 49 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.111007 s
19:48:34.415 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741860_1036 replica FinalizedReplica, blk_1073741860_1036, FINALIZED, getNumBytes()=204, getBytesOnDisk()=204, getVisibleLength()=204, getVolume()=/tmp/minicluster_storage10689261495343833868/data/data2, getBlockURI()=file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741860 for deletion
19:48:34.415 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741861_1037 replica FinalizedReplica, blk_1073741861_1037, FINALIZED, getNumBytes()=592, getBytesOnDisk()=592, getVisibleLength()=592, getVolume()=/tmp/minicluster_storage10689261495343833868/data/data1, getBlockURI()=file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741861 for deletion
19:48:34.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741860_1036 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741860
19:48:34.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741861_1037 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741861
19:48:34.420 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:34.420 INFO DAGScheduler - Got job 50 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:34.420 INFO DAGScheduler - Final stage: ResultStage 67 (count at ReadsSparkSinkUnitTest.java:185)
19:48:34.420 INFO DAGScheduler - Parents of final stage: List()
19:48:34.420 INFO DAGScheduler - Missing parents: List()
19:48:34.421 INFO DAGScheduler - Submitting ResultStage 67 (MapPartitionsRDD[246] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:34.432 INFO MemoryStore - Block broadcast_120 stored as values in memory (estimated size 286.8 KiB, free 1916.3 MiB)
19:48:34.433 INFO MemoryStore - Block broadcast_120_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1916.2 MiB)
19:48:34.433 INFO BlockManagerInfo - Added broadcast_120_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.2 MiB)
19:48:34.434 INFO SparkContext - Created broadcast 120 from broadcast at DAGScheduler.scala:1580
19:48:34.434 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 67 (MapPartitionsRDD[246] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:34.434 INFO TaskSchedulerImpl - Adding task set 67.0 with 1 tasks resource profile 0
19:48:34.434 INFO TaskSetManager - Starting task 0.0 in stage 67.0 (TID 105) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
19:48:34.435 INFO Executor - Running task 0.0 in stage 67.0 (TID 105)
19:48:34.455 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:34.466 INFO Executor - Finished task 0.0 in stage 67.0 (TID 105). 989 bytes result sent to driver
19:48:34.467 INFO TaskSetManager - Finished task 0.0 in stage 67.0 (TID 105) in 33 ms on localhost (executor driver) (1/1)
19:48:34.467 INFO TaskSchedulerImpl - Removed TaskSet 67.0, whose tasks have all completed, from pool
19:48:34.467 INFO DAGScheduler - ResultStage 67 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.046 s
19:48:34.467 INFO DAGScheduler - Job 50 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:34.467 INFO TaskSchedulerImpl - Killing all running tasks in stage 67: Stage finished
19:48:34.468 INFO DAGScheduler - Job 50 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.047526 s
19:48:34.471 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:34.471 INFO DAGScheduler - Got job 51 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:34.471 INFO DAGScheduler - Final stage: ResultStage 68 (count at ReadsSparkSinkUnitTest.java:185)
19:48:34.471 INFO DAGScheduler - Parents of final stage: List()
19:48:34.471 INFO DAGScheduler - Missing parents: List()
19:48:34.472 INFO DAGScheduler - Submitting ResultStage 68 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:34.483 INFO MemoryStore - Block broadcast_121 stored as values in memory (estimated size 286.8 KiB, free 1915.9 MiB)
19:48:34.484 INFO MemoryStore - Block broadcast_121_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1915.8 MiB)
19:48:34.485 INFO BlockManagerInfo - Added broadcast_121_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.1 MiB)
19:48:34.485 INFO SparkContext - Created broadcast 121 from broadcast at DAGScheduler.scala:1580
19:48:34.485 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 68 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:34.485 INFO TaskSchedulerImpl - Adding task set 68.0 with 1 tasks resource profile 0
19:48:34.486 INFO TaskSetManager - Starting task 0.0 in stage 68.0 (TID 106) (localhost, executor driver, partition 0, ANY, 7853 bytes)
19:48:34.486 INFO Executor - Running task 0.0 in stage 68.0 (TID 106)
19:48:34.512 INFO BlockManagerInfo - Removed broadcast_112_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.1 MiB)
19:48:34.513 INFO BlockManagerInfo - Removed broadcast_104_piece0 on localhost:36125 in memory (size: 231.0 B, free: 1919.1 MiB)
19:48:34.514 INFO BlockManagerInfo - Removed broadcast_111_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.1 MiB)
19:48:34.514 INFO BlockManagerInfo - Removed broadcast_107_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.3 MiB)
19:48:34.515 INFO BlockManagerInfo - Removed broadcast_108_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.3 MiB)
19:48:34.516 INFO BlockManagerInfo - Removed broadcast_98_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:34.517 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram:0+43713
19:48:34.517 INFO BlockManagerInfo - Removed broadcast_105_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:34.518 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.518 INFO BlockManagerInfo - Removed broadcast_113_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.4 MiB)
19:48:34.518 INFO BlockManagerInfo - Removed broadcast_114_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.4 MiB)
19:48:34.518 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram dst=null perm=null proto=rpc
19:48:34.519 INFO BlockManagerInfo - Removed broadcast_119_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.5 MiB)
19:48:34.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.cram.crai dst=null perm=null proto=rpc
19:48:34.520 INFO BlockManagerInfo - Removed broadcast_120_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.6 MiB)
19:48:34.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_1368e3de-b9ab-47e7-b32c-855b148c6a49.crai dst=null perm=null proto=rpc
19:48:34.520 INFO BlockManagerInfo - Removed broadcast_116_piece0 on localhost:36125 in memory (size: 58.1 KiB, free: 1919.7 MiB)
19:48:34.522 INFO BlockManagerInfo - Removed broadcast_115_piece0 on localhost:36125 in memory (size: 107.3 KiB, free: 1919.8 MiB)
19:48:34.523 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
19:48:34.523 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
19:48:34.524 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
19:48:34.543 INFO Executor - Finished task 0.0 in stage 68.0 (TID 106). 1032 bytes result sent to driver
19:48:34.544 INFO TaskSetManager - Finished task 0.0 in stage 68.0 (TID 106) in 59 ms on localhost (executor driver) (1/1)
19:48:34.544 INFO TaskSchedulerImpl - Removed TaskSet 68.0, whose tasks have all completed, from pool
19:48:34.544 INFO DAGScheduler - ResultStage 68 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.072 s
19:48:34.544 INFO DAGScheduler - Job 51 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:34.544 INFO TaskSchedulerImpl - Killing all running tasks in stage 68: Stage finished
19:48:34.544 INFO DAGScheduler - Job 51 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.073182 s
19:48:34.549 INFO MemoryStore - Block broadcast_122 stored as values in memory (estimated size 297.9 KiB, free 1918.6 MiB)
19:48:34.559 INFO MemoryStore - Block broadcast_122_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.6 MiB)
19:48:34.559 INFO BlockManagerInfo - Added broadcast_122_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.8 MiB)
19:48:34.559 INFO SparkContext - Created broadcast 122 from newAPIHadoopFile at PathSplitSource.java:96
19:48:34.582 INFO MemoryStore - Block broadcast_123 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
19:48:34.588 INFO MemoryStore - Block broadcast_123_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.3 MiB)
19:48:34.588 INFO BlockManagerInfo - Added broadcast_123_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:34.588 INFO SparkContext - Created broadcast 123 from newAPIHadoopFile at PathSplitSource.java:96
19:48:34.608 INFO FileInputFormat - Total input files to process : 1
19:48:34.610 INFO MemoryStore - Block broadcast_124 stored as values in memory (estimated size 160.7 KiB, free 1918.1 MiB)
19:48:34.611 INFO MemoryStore - Block broadcast_124_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.1 MiB)
19:48:34.611 INFO BlockManagerInfo - Added broadcast_124_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.7 MiB)
19:48:34.611 INFO SparkContext - Created broadcast 124 from broadcast at ReadsSparkSink.java:133
19:48:34.622 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts dst=null perm=null proto=rpc
19:48:34.623 INFO deprecation - mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
19:48:34.624 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19:48:34.624 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:34.624 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:34.625 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:34.638 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:34.638 INFO DAGScheduler - Registering RDD 277 (mapToPair at SparkUtils.java:161) as input to shuffle 15
19:48:34.638 INFO DAGScheduler - Got job 52 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:34.638 INFO DAGScheduler - Final stage: ResultStage 70 (runJob at SparkHadoopWriter.scala:83)
19:48:34.638 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 69)
19:48:34.639 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 69)
19:48:34.639 INFO DAGScheduler - Submitting ShuffleMapStage 69 (MapPartitionsRDD[277] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:34.657 INFO MemoryStore - Block broadcast_125 stored as values in memory (estimated size 520.4 KiB, free 1917.6 MiB)
19:48:34.658 INFO MemoryStore - Block broadcast_125_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.4 MiB)
19:48:34.658 INFO BlockManagerInfo - Added broadcast_125_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.5 MiB)
19:48:34.658 INFO SparkContext - Created broadcast 125 from broadcast at DAGScheduler.scala:1580
19:48:34.659 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 69 (MapPartitionsRDD[277] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:34.659 INFO TaskSchedulerImpl - Adding task set 69.0 with 1 tasks resource profile 0
19:48:34.659 INFO TaskSetManager - Starting task 0.0 in stage 69.0 (TID 107) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:34.660 INFO Executor - Running task 0.0 in stage 69.0 (TID 107)
19:48:34.690 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:34.708 INFO Executor - Finished task 0.0 in stage 69.0 (TID 107). 1148 bytes result sent to driver
19:48:34.708 INFO TaskSetManager - Finished task 0.0 in stage 69.0 (TID 107) in 49 ms on localhost (executor driver) (1/1)
19:48:34.708 INFO TaskSchedulerImpl - Removed TaskSet 69.0, whose tasks have all completed, from pool
19:48:34.708 INFO DAGScheduler - ShuffleMapStage 69 (mapToPair at SparkUtils.java:161) finished in 0.069 s
19:48:34.708 INFO DAGScheduler - looking for newly runnable stages
19:48:34.708 INFO DAGScheduler - running: HashSet()
19:48:34.708 INFO DAGScheduler - waiting: HashSet(ResultStage 70)
19:48:34.709 INFO DAGScheduler - failed: HashSet()
19:48:34.709 INFO DAGScheduler - Submitting ResultStage 70 (MapPartitionsRDD[283] at saveAsTextFile at SamSink.java:65), which has no missing parents
19:48:34.716 INFO MemoryStore - Block broadcast_126 stored as values in memory (estimated size 241.1 KiB, free 1917.2 MiB)
19:48:34.717 INFO MemoryStore - Block broadcast_126_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1917.1 MiB)
19:48:34.717 INFO BlockManagerInfo - Added broadcast_126_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.5 MiB)
19:48:34.717 INFO SparkContext - Created broadcast 126 from broadcast at DAGScheduler.scala:1580
19:48:34.717 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 70 (MapPartitionsRDD[283] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
19:48:34.717 INFO TaskSchedulerImpl - Adding task set 70.0 with 1 tasks resource profile 0
19:48:34.718 INFO TaskSetManager - Starting task 0.0 in stage 70.0 (TID 108) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:34.718 INFO Executor - Running task 0.0 in stage 70.0 (TID 108)
19:48:34.723 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:34.723 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:34.736 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19:48:34.737 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:34.737 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:34.738 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/_temporary/attempt_202507151948347635371293966205249_0283_m_000000_0/part-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:34.741 INFO StateChange - BLOCK* allocate blk_1073741869_1045, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/_temporary/attempt_202507151948347635371293966205249_0283_m_000000_0/part-00000
19:48:34.743 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741869_1045 src: /127.0.0.1:50806 dest: /127.0.0.1:45925
19:48:34.750 INFO clienttrace - src: /127.0.0.1:50806, dest: /127.0.0.1:45925, bytes: 761729, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741869_1045, duration(ns): 4915010
19:48:34.750 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741869_1045, type=LAST_IN_PIPELINE terminating
19:48:34.750 INFO FSNamesystem - BLOCK* blk_1073741869_1045 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/_temporary/attempt_202507151948347635371293966205249_0283_m_000000_0/part-00000
19:48:35.151 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/_temporary/attempt_202507151948347635371293966205249_0283_m_000000_0/part-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:35.152 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/_temporary/attempt_202507151948347635371293966205249_0283_m_000000_0 dst=null perm=null proto=rpc
19:48:35.153 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/_temporary/attempt_202507151948347635371293966205249_0283_m_000000_0 dst=null perm=null proto=rpc
19:48:35.153 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/task_202507151948347635371293966205249_0283_m_000000 dst=null perm=null proto=rpc
19:48:35.154 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/_temporary/attempt_202507151948347635371293966205249_0283_m_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/task_202507151948347635371293966205249_0283_m_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:35.154 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948347635371293966205249_0283_m_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/task_202507151948347635371293966205249_0283_m_000000
19:48:35.154 INFO SparkHadoopMapRedUtil - attempt_202507151948347635371293966205249_0283_m_000000_0: Committed. Elapsed time: 1 ms.
19:48:35.155 INFO Executor - Finished task 0.0 in stage 70.0 (TID 108). 1858 bytes result sent to driver
19:48:35.155 INFO TaskSetManager - Finished task 0.0 in stage 70.0 (TID 108) in 437 ms on localhost (executor driver) (1/1)
19:48:35.155 INFO TaskSchedulerImpl - Removed TaskSet 70.0, whose tasks have all completed, from pool
19:48:35.156 INFO DAGScheduler - ResultStage 70 (runJob at SparkHadoopWriter.scala:83) finished in 0.447 s
19:48:35.156 INFO DAGScheduler - Job 52 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:35.156 INFO TaskSchedulerImpl - Killing all running tasks in stage 70: Stage finished
19:48:35.156 INFO DAGScheduler - Job 52 finished: runJob at SparkHadoopWriter.scala:83, took 0.518275 s
19:48:35.156 INFO SparkHadoopWriter - Start to commit write Job job_202507151948347635371293966205249_0283.
19:48:35.157 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:35.157 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts dst=null perm=null proto=rpc
19:48:35.158 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/task_202507151948347635371293966205249_0283_m_000000 dst=null perm=null proto=rpc
19:48:35.158 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/part-00000 dst=null perm=null proto=rpc
19:48:35.159 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary/0/task_202507151948347635371293966205249_0283_m_000000/part-00000 dst=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/part-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:35.159 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_temporary dst=null perm=null proto=rpc
19:48:35.160 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:35.161 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:35.161 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/.spark-staging-283 dst=null perm=null proto=rpc
19:48:35.162 INFO SparkHadoopWriter - Write Job job_202507151948347635371293966205249_0283 committed. Elapsed time: 5 ms.
19:48:35.162 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:35.165 INFO StateChange - BLOCK* allocate blk_1073741870_1046, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/header
19:48:35.166 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741870_1046 src: /127.0.0.1:50820 dest: /127.0.0.1:45925
19:48:35.167 INFO clienttrace - src: /127.0.0.1:50820, dest: /127.0.0.1:45925, bytes: 85829, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741870_1046, duration(ns): 664465
19:48:35.167 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741870_1046, type=LAST_IN_PIPELINE terminating
19:48:35.168 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:35.168 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts dst=null perm=null proto=rpc
19:48:35.169 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:35.170 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:35.170 INFO HadoopFileSystemWrapper - Concatenating 2 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam
19:48:35.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/header, /user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/part-00000] dst=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:35.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:35.172 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam done
19:48:35.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam.parts dst=null perm=null proto=rpc
19:48:35.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.174 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.174 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35 WARN SamReaderFactory - Unable to detect file format from input URL or stream, assuming SAM format.
19:48:35.176 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
19:48:35.178 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
19:48:35.178 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.178 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.179 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35 WARN SamReaderFactory - Unable to detect file format from input URL or stream, assuming SAM format.
19:48:35.181 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
19:48:35.183 INFO MemoryStore - Block broadcast_127 stored as values in memory (estimated size 160.7 KiB, free 1917.0 MiB)
19:48:35.183 INFO MemoryStore - Block broadcast_127_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.0 MiB)
19:48:35.184 INFO BlockManagerInfo - Added broadcast_127_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:35.184 INFO SparkContext - Created broadcast 127 from broadcast at SamSource.java:78
19:48:35.185 INFO MemoryStore - Block broadcast_128 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
19:48:35.191 INFO MemoryStore - Block broadcast_128_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
19:48:35.191 INFO BlockManagerInfo - Added broadcast_128_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:35.192 INFO SparkContext - Created broadcast 128 from newAPIHadoopFile at SamSource.java:108
19:48:35.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.199 INFO FileInputFormat - Total input files to process : 1
19:48:35.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.209 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:35.210 INFO DAGScheduler - Got job 53 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:35.210 INFO DAGScheduler - Final stage: ResultStage 71 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:35.210 INFO DAGScheduler - Parents of final stage: List()
19:48:35.210 INFO DAGScheduler - Missing parents: List()
19:48:35.210 INFO DAGScheduler - Submitting ResultStage 71 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:35.210 INFO MemoryStore - Block broadcast_129 stored as values in memory (estimated size 7.5 KiB, free 1916.6 MiB)
19:48:35.211 INFO MemoryStore - Block broadcast_129_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1916.6 MiB)
19:48:35.211 INFO BlockManagerInfo - Added broadcast_129_piece0 in memory on localhost:36125 (size: 3.8 KiB, free: 1919.4 MiB)
19:48:35.211 INFO SparkContext - Created broadcast 129 from broadcast at DAGScheduler.scala:1580
19:48:35.212 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 71 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:35.212 INFO TaskSchedulerImpl - Adding task set 71.0 with 1 tasks resource profile 0
19:48:35.212 INFO TaskSetManager - Starting task 0.0 in stage 71.0 (TID 109) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:35.212 INFO Executor - Running task 0.0 in stage 71.0 (TID 109)
19:48:35.214 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam:0+847558
19:48:35.218 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.254 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
19:48:35.263 INFO Executor - Finished task 0.0 in stage 71.0 (TID 109). 651526 bytes result sent to driver
19:48:35.264 INFO TaskSetManager - Finished task 0.0 in stage 71.0 (TID 109) in 52 ms on localhost (executor driver) (1/1)
19:48:35.264 INFO TaskSchedulerImpl - Removed TaskSet 71.0, whose tasks have all completed, from pool
19:48:35.265 INFO DAGScheduler - ResultStage 71 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.055 s
19:48:35.265 INFO DAGScheduler - Job 53 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:35.265 INFO TaskSchedulerImpl - Killing all running tasks in stage 71: Stage finished
19:48:35.265 INFO DAGScheduler - Job 53 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.055572 s
19:48:35.280 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:35.281 INFO DAGScheduler - Got job 54 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:35.281 INFO DAGScheduler - Final stage: ResultStage 72 (count at ReadsSparkSinkUnitTest.java:185)
19:48:35.281 INFO DAGScheduler - Parents of final stage: List()
19:48:35.281 INFO DAGScheduler - Missing parents: List()
19:48:35.281 INFO DAGScheduler - Submitting ResultStage 72 (MapPartitionsRDD[270] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:35.297 INFO MemoryStore - Block broadcast_130 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
19:48:35.299 INFO MemoryStore - Block broadcast_130_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.0 MiB)
19:48:35.299 INFO BlockManagerInfo - Added broadcast_130_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.3 MiB)
19:48:35.299 INFO SparkContext - Created broadcast 130 from broadcast at DAGScheduler.scala:1580
19:48:35.300 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 72 (MapPartitionsRDD[270] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:35.300 INFO TaskSchedulerImpl - Adding task set 72.0 with 1 tasks resource profile 0
19:48:35.300 INFO TaskSetManager - Starting task 0.0 in stage 72.0 (TID 110) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:35.301 INFO Executor - Running task 0.0 in stage 72.0 (TID 110)
19:48:35.330 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:35.340 INFO Executor - Finished task 0.0 in stage 72.0 (TID 110). 989 bytes result sent to driver
19:48:35.340 INFO TaskSetManager - Finished task 0.0 in stage 72.0 (TID 110) in 40 ms on localhost (executor driver) (1/1)
19:48:35.340 INFO TaskSchedulerImpl - Removed TaskSet 72.0, whose tasks have all completed, from pool
19:48:35.341 INFO DAGScheduler - ResultStage 72 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
19:48:35.341 INFO DAGScheduler - Job 54 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:35.341 INFO TaskSchedulerImpl - Killing all running tasks in stage 72: Stage finished
19:48:35.341 INFO DAGScheduler - Job 54 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060610 s
19:48:35.344 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:35.345 INFO DAGScheduler - Got job 55 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:35.345 INFO DAGScheduler - Final stage: ResultStage 73 (count at ReadsSparkSinkUnitTest.java:185)
19:48:35.345 INFO DAGScheduler - Parents of final stage: List()
19:48:35.345 INFO DAGScheduler - Missing parents: List()
19:48:35.345 INFO DAGScheduler - Submitting ResultStage 73 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:35.346 INFO MemoryStore - Block broadcast_131 stored as values in memory (estimated size 7.4 KiB, free 1916.0 MiB)
19:48:35.346 INFO MemoryStore - Block broadcast_131_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1916.0 MiB)
19:48:35.346 INFO BlockManagerInfo - Added broadcast_131_piece0 in memory on localhost:36125 (size: 3.8 KiB, free: 1919.2 MiB)
19:48:35.347 INFO SparkContext - Created broadcast 131 from broadcast at DAGScheduler.scala:1580
19:48:35.347 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 73 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:35.347 INFO TaskSchedulerImpl - Adding task set 73.0 with 1 tasks resource profile 0
19:48:35.347 INFO TaskSetManager - Starting task 0.0 in stage 73.0 (TID 111) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:35.348 INFO Executor - Running task 0.0 in stage 73.0 (TID 111)
19:48:35.349 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam:0+847558
19:48:35.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_7d177dc2-9bda-4de9-a469-818510617eb8.sam dst=null perm=null proto=rpc
19:48:35.352 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
19:48:35.363 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
19:48:35.364 INFO Executor - Finished task 0.0 in stage 73.0 (TID 111). 989 bytes result sent to driver
19:48:35.364 INFO TaskSetManager - Finished task 0.0 in stage 73.0 (TID 111) in 17 ms on localhost (executor driver) (1/1)
19:48:35.364 INFO TaskSchedulerImpl - Removed TaskSet 73.0, whose tasks have all completed, from pool
19:48:35.364 INFO DAGScheduler - ResultStage 73 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.019 s
19:48:35.364 INFO DAGScheduler - Job 55 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:35.364 INFO TaskSchedulerImpl - Killing all running tasks in stage 73: Stage finished
19:48:35.364 INFO DAGScheduler - Job 55 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.020076 s
19:48:35.368 INFO MemoryStore - Block broadcast_132 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
19:48:35.374 INFO MemoryStore - Block broadcast_132_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.7 MiB)
19:48:35.374 INFO BlockManagerInfo - Added broadcast_132_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.2 MiB)
19:48:35.374 INFO SparkContext - Created broadcast 132 from newAPIHadoopFile at PathSplitSource.java:96
19:48:35.400 INFO MemoryStore - Block broadcast_133 stored as values in memory (estimated size 297.9 KiB, free 1915.4 MiB)
19:48:35.407 INFO MemoryStore - Block broadcast_133_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
19:48:35.407 INFO BlockManagerInfo - Added broadcast_133_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.2 MiB)
19:48:35.407 INFO SparkContext - Created broadcast 133 from newAPIHadoopFile at PathSplitSource.java:96
19:48:35.428 INFO MemoryStore - Block broadcast_134 stored as values in memory (estimated size 160.7 KiB, free 1915.2 MiB)
19:48:35.429 INFO MemoryStore - Block broadcast_134_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.2 MiB)
19:48:35.430 INFO BlockManagerInfo - Added broadcast_134_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:35.430 INFO SparkContext - Created broadcast 134 from broadcast at ReadsSparkSink.java:133
19:48:35.432 INFO MemoryStore - Block broadcast_135 stored as values in memory (estimated size 163.2 KiB, free 1915.0 MiB)
19:48:35.440 INFO MemoryStore - Block broadcast_135_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.0 MiB)
19:48:35.440 INFO BlockManagerInfo - Added broadcast_135_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:35.440 INFO BlockManagerInfo - Removed broadcast_131_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1919.1 MiB)
19:48:35.441 INFO SparkContext - Created broadcast 135 from broadcast at AnySamSinkMultiple.java:80
19:48:35.441 INFO BlockManagerInfo - Removed broadcast_129_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1919.1 MiB)
19:48:35.442 INFO BlockManagerInfo - Removed broadcast_121_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.2 MiB)
19:48:35.442 INFO BlockManagerInfo - Removed broadcast_117_piece0 on localhost:36125 in memory (size: 187.0 B, free: 1919.2 MiB)
19:48:35.443 INFO BlockManagerInfo - Removed broadcast_110_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:35.443 INFO BlockManagerInfo - Removed broadcast_123_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:35.444 INFO BlockManagerInfo - Removed broadcast_125_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.5 MiB)
19:48:35.445 INFO BlockManagerInfo - Removed broadcast_133_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:35.446 INFO BlockManagerInfo - Removed broadcast_118_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:35.446 INFO BlockManagerInfo - Removed broadcast_128_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:35.447 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.447 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.447 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.447 INFO BlockManagerInfo - Removed broadcast_130_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.8 MiB)
19:48:35.448 INFO BlockManagerInfo - Removed broadcast_126_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.9 MiB)
19:48:35.448 INFO BlockManagerInfo - Removed broadcast_109_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.9 MiB)
19:48:35.449 INFO BlockManagerInfo - Removed broadcast_127_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:35.450 INFO BlockManagerInfo - Removed broadcast_122_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:35.450 INFO BlockManagerInfo - Removed broadcast_124_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:35.462 INFO FileInputFormat - Total input files to process : 1
19:48:35.470 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:35.471 INFO DAGScheduler - Registering RDD 296 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 16
19:48:35.471 INFO DAGScheduler - Got job 56 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:35.471 INFO DAGScheduler - Final stage: ResultStage 75 (runJob at SparkHadoopWriter.scala:83)
19:48:35.471 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 74)
19:48:35.471 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 74)
19:48:35.471 INFO DAGScheduler - Submitting ShuffleMapStage 74 (MapPartitionsRDD[296] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:35.501 INFO MemoryStore - Block broadcast_136 stored as values in memory (estimated size 427.7 KiB, free 1918.9 MiB)
19:48:35.503 INFO MemoryStore - Block broadcast_136_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.8 MiB)
19:48:35.503 INFO BlockManagerInfo - Added broadcast_136_piece0 in memory on localhost:36125 (size: 154.6 KiB, free: 1919.8 MiB)
19:48:35.504 INFO SparkContext - Created broadcast 136 from broadcast at DAGScheduler.scala:1580
19:48:35.504 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 74 (MapPartitionsRDD[296] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:35.504 INFO TaskSchedulerImpl - Adding task set 74.0 with 1 tasks resource profile 0
19:48:35.505 INFO TaskSetManager - Starting task 0.0 in stage 74.0 (TID 112) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:35.505 INFO Executor - Running task 0.0 in stage 74.0 (TID 112)
19:48:35.540 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:35.559 INFO Executor - Finished task 0.0 in stage 74.0 (TID 112). 1149 bytes result sent to driver
19:48:35.560 INFO TaskSetManager - Finished task 0.0 in stage 74.0 (TID 112) in 56 ms on localhost (executor driver) (1/1)
19:48:35.560 INFO TaskSchedulerImpl - Removed TaskSet 74.0, whose tasks have all completed, from pool
19:48:35.560 INFO DAGScheduler - ShuffleMapStage 74 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.088 s
19:48:35.560 INFO DAGScheduler - looking for newly runnable stages
19:48:35.560 INFO DAGScheduler - running: HashSet()
19:48:35.560 INFO DAGScheduler - waiting: HashSet(ResultStage 75)
19:48:35.560 INFO DAGScheduler - failed: HashSet()
19:48:35.560 INFO DAGScheduler - Submitting ResultStage 75 (MapPartitionsRDD[308] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:35.573 INFO MemoryStore - Block broadcast_137 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
19:48:35.574 INFO MemoryStore - Block broadcast_137_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1918.6 MiB)
19:48:35.574 INFO BlockManagerInfo - Added broadcast_137_piece0 in memory on localhost:36125 (size: 56.3 KiB, free: 1919.7 MiB)
19:48:35.574 INFO SparkContext - Created broadcast 137 from broadcast at DAGScheduler.scala:1580
19:48:35.574 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 75 (MapPartitionsRDD[308] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:35.574 INFO TaskSchedulerImpl - Adding task set 75.0 with 2 tasks resource profile 0
19:48:35.575 INFO TaskSetManager - Starting task 0.0 in stage 75.0 (TID 113) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:35.575 INFO TaskSetManager - Starting task 1.0 in stage 75.0 (TID 114) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:35.576 INFO Executor - Running task 0.0 in stage 75.0 (TID 113)
19:48:35.576 INFO Executor - Running task 1.0 in stage 75.0 (TID 114)
19:48:35.581 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.581 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.581 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.581 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.582 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.582 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.583 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.583 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.583 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.583 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.583 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.583 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.595 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:35.595 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:35.597 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:35.597 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:35.605 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948354022456107156463385_0308_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest111040959680649232863.bam/_temporary/0/task_202507151948354022456107156463385_0308_r_000000
19:48:35.605 INFO SparkHadoopMapRedUtil - attempt_202507151948354022456107156463385_0308_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:35.606 INFO Executor - Finished task 0.0 in stage 75.0 (TID 113). 1729 bytes result sent to driver
19:48:35.606 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948354022456107156463385_0308_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest111040959680649232863.bam/_temporary/0/task_202507151948354022456107156463385_0308_r_000001
19:48:35.606 INFO SparkHadoopMapRedUtil - attempt_202507151948354022456107156463385_0308_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:35.607 INFO Executor - Finished task 1.0 in stage 75.0 (TID 114). 1729 bytes result sent to driver
19:48:35.607 INFO TaskSetManager - Finished task 0.0 in stage 75.0 (TID 113) in 32 ms on localhost (executor driver) (1/2)
19:48:35.607 INFO TaskSetManager - Finished task 1.0 in stage 75.0 (TID 114) in 32 ms on localhost (executor driver) (2/2)
19:48:35.607 INFO TaskSchedulerImpl - Removed TaskSet 75.0, whose tasks have all completed, from pool
19:48:35.607 INFO DAGScheduler - ResultStage 75 (runJob at SparkHadoopWriter.scala:83) finished in 0.046 s
19:48:35.607 INFO DAGScheduler - Job 56 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:35.607 INFO TaskSchedulerImpl - Killing all running tasks in stage 75: Stage finished
19:48:35.608 INFO DAGScheduler - Job 56 finished: runJob at SparkHadoopWriter.scala:83, took 0.136996 s
19:48:35.608 INFO SparkHadoopWriter - Start to commit write Job job_202507151948354022456107156463385_0308.
19:48:35.613 INFO SparkHadoopWriter - Write Job job_202507151948354022456107156463385_0308 committed. Elapsed time: 5 ms.
19:48:35.616 INFO MemoryStore - Block broadcast_138 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
19:48:35.623 INFO MemoryStore - Block broadcast_138_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
19:48:35.623 INFO BlockManagerInfo - Added broadcast_138_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:35.623 INFO SparkContext - Created broadcast 138 from newAPIHadoopFile at PathSplitSource.java:96
19:48:35.646 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:35.646 INFO DAGScheduler - Got job 57 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:35.646 INFO DAGScheduler - Final stage: ResultStage 77 (count at ReadsSparkSinkUnitTest.java:222)
19:48:35.646 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 76)
19:48:35.646 INFO DAGScheduler - Missing parents: List()
19:48:35.646 INFO DAGScheduler - Submitting ResultStage 77 (MapPartitionsRDD[299] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:35.647 INFO MemoryStore - Block broadcast_139 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
19:48:35.648 INFO MemoryStore - Block broadcast_139_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
19:48:35.648 INFO BlockManagerInfo - Added broadcast_139_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.7 MiB)
19:48:35.648 INFO SparkContext - Created broadcast 139 from broadcast at DAGScheduler.scala:1580
19:48:35.648 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 77 (MapPartitionsRDD[299] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:35.648 INFO TaskSchedulerImpl - Adding task set 77.0 with 2 tasks resource profile 0
19:48:35.649 INFO TaskSetManager - Starting task 0.0 in stage 77.0 (TID 115) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:35.649 INFO TaskSetManager - Starting task 1.0 in stage 77.0 (TID 116) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:35.650 INFO Executor - Running task 1.0 in stage 77.0 (TID 116)
19:48:35.650 INFO Executor - Running task 0.0 in stage 77.0 (TID 115)
19:48:35.652 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:35.652 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:35.652 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:35.652 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:35.656 INFO Executor - Finished task 0.0 in stage 77.0 (TID 115). 1634 bytes result sent to driver
19:48:35.657 INFO TaskSetManager - Finished task 0.0 in stage 77.0 (TID 115) in 8 ms on localhost (executor driver) (1/2)
19:48:35.657 INFO Executor - Finished task 1.0 in stage 77.0 (TID 116). 1634 bytes result sent to driver
19:48:35.657 INFO TaskSetManager - Finished task 1.0 in stage 77.0 (TID 116) in 8 ms on localhost (executor driver) (2/2)
19:48:35.657 INFO TaskSchedulerImpl - Removed TaskSet 77.0, whose tasks have all completed, from pool
19:48:35.657 INFO DAGScheduler - ResultStage 77 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
19:48:35.657 INFO DAGScheduler - Job 57 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:35.657 INFO TaskSchedulerImpl - Killing all running tasks in stage 77: Stage finished
19:48:35.657 INFO DAGScheduler - Job 57 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011704 s
19:48:35.670 INFO FileInputFormat - Total input files to process : 2
19:48:35.674 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:35.674 INFO DAGScheduler - Got job 58 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:35.674 INFO DAGScheduler - Final stage: ResultStage 78 (count at ReadsSparkSinkUnitTest.java:222)
19:48:35.674 INFO DAGScheduler - Parents of final stage: List()
19:48:35.674 INFO DAGScheduler - Missing parents: List()
19:48:35.675 INFO DAGScheduler - Submitting ResultStage 78 (MapPartitionsRDD[315] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:35.701 INFO MemoryStore - Block broadcast_140 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
19:48:35.702 INFO MemoryStore - Block broadcast_140_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
19:48:35.703 INFO BlockManagerInfo - Added broadcast_140_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:35.703 INFO SparkContext - Created broadcast 140 from broadcast at DAGScheduler.scala:1580
19:48:35.703 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 78 (MapPartitionsRDD[315] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:35.703 INFO TaskSchedulerImpl - Adding task set 78.0 with 2 tasks resource profile 0
19:48:35.704 INFO TaskSetManager - Starting task 0.0 in stage 78.0 (TID 117) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:35.704 INFO TaskSetManager - Starting task 1.0 in stage 78.0 (TID 118) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:35.704 INFO Executor - Running task 0.0 in stage 78.0 (TID 117)
19:48:35.704 INFO Executor - Running task 1.0 in stage 78.0 (TID 118)
19:48:35.734 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111040959680649232863.bam/part-r-00000.bam:0+132492
19:48:35.744 INFO Executor - Finished task 1.0 in stage 78.0 (TID 118). 989 bytes result sent to driver
19:48:35.744 INFO TaskSetManager - Finished task 1.0 in stage 78.0 (TID 118) in 40 ms on localhost (executor driver) (1/2)
19:48:35.748 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111040959680649232863.bam/part-r-00001.bam:0+129330
19:48:35.761 INFO Executor - Finished task 0.0 in stage 78.0 (TID 117). 989 bytes result sent to driver
19:48:35.761 INFO TaskSetManager - Finished task 0.0 in stage 78.0 (TID 117) in 57 ms on localhost (executor driver) (2/2)
19:48:35.761 INFO TaskSchedulerImpl - Removed TaskSet 78.0, whose tasks have all completed, from pool
19:48:35.762 INFO DAGScheduler - ResultStage 78 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.087 s
19:48:35.762 INFO DAGScheduler - Job 58 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:35.762 INFO TaskSchedulerImpl - Killing all running tasks in stage 78: Stage finished
19:48:35.762 INFO DAGScheduler - Job 58 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.087917 s
19:48:35.766 INFO MemoryStore - Block broadcast_141 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
19:48:35.776 INFO MemoryStore - Block broadcast_141_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
19:48:35.776 INFO BlockManagerInfo - Added broadcast_141_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:35.777 INFO SparkContext - Created broadcast 141 from newAPIHadoopFile at PathSplitSource.java:96
19:48:35.801 INFO MemoryStore - Block broadcast_142 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
19:48:35.812 INFO MemoryStore - Block broadcast_142_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
19:48:35.812 INFO BlockManagerInfo - Added broadcast_142_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:35.812 INFO SparkContext - Created broadcast 142 from newAPIHadoopFile at PathSplitSource.java:96
19:48:35.842 INFO MemoryStore - Block broadcast_143 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
19:48:35.843 INFO MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
19:48:35.843 INFO BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:48:35.843 INFO SparkContext - Created broadcast 143 from broadcast at ReadsSparkSink.java:133
19:48:35.845 INFO MemoryStore - Block broadcast_144 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
19:48:35.846 INFO MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
19:48:35.846 INFO BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:48:35.846 INFO SparkContext - Created broadcast 144 from broadcast at AnySamSinkMultiple.java:80
19:48:35.849 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.849 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.849 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.863 INFO FileInputFormat - Total input files to process : 1
19:48:35.871 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:35.872 INFO DAGScheduler - Registering RDD 323 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 17
19:48:35.872 INFO DAGScheduler - Got job 59 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:35.872 INFO DAGScheduler - Final stage: ResultStage 80 (runJob at SparkHadoopWriter.scala:83)
19:48:35.872 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 79)
19:48:35.872 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 79)
19:48:35.872 INFO DAGScheduler - Submitting ShuffleMapStage 79 (MapPartitionsRDD[323] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:35.896 INFO MemoryStore - Block broadcast_145 stored as values in memory (estimated size 427.7 KiB, free 1916.2 MiB)
19:48:35.898 INFO MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.1 MiB)
19:48:35.898 INFO BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:36125 (size: 154.6 KiB, free: 1919.3 MiB)
19:48:35.898 INFO SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1580
19:48:35.898 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 79 (MapPartitionsRDD[323] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:35.898 INFO TaskSchedulerImpl - Adding task set 79.0 with 1 tasks resource profile 0
19:48:35.899 INFO TaskSetManager - Starting task 0.0 in stage 79.0 (TID 119) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:35.900 INFO Executor - Running task 0.0 in stage 79.0 (TID 119)
19:48:35.936 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:35.955 INFO Executor - Finished task 0.0 in stage 79.0 (TID 119). 1149 bytes result sent to driver
19:48:35.956 INFO TaskSetManager - Finished task 0.0 in stage 79.0 (TID 119) in 57 ms on localhost (executor driver) (1/1)
19:48:35.956 INFO TaskSchedulerImpl - Removed TaskSet 79.0, whose tasks have all completed, from pool
19:48:35.956 INFO DAGScheduler - ShuffleMapStage 79 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.083 s
19:48:35.956 INFO DAGScheduler - looking for newly runnable stages
19:48:35.956 INFO DAGScheduler - running: HashSet()
19:48:35.956 INFO DAGScheduler - waiting: HashSet(ResultStage 80)
19:48:35.956 INFO DAGScheduler - failed: HashSet()
19:48:35.956 INFO DAGScheduler - Submitting ResultStage 80 (MapPartitionsRDD[335] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:35.963 INFO MemoryStore - Block broadcast_146 stored as values in memory (estimated size 150.2 KiB, free 1915.9 MiB)
19:48:35.964 INFO MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1915.9 MiB)
19:48:35.964 INFO BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:36125 (size: 56.2 KiB, free: 1919.2 MiB)
19:48:35.964 INFO SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1580
19:48:35.964 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 80 (MapPartitionsRDD[335] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:35.964 INFO TaskSchedulerImpl - Adding task set 80.0 with 2 tasks resource profile 0
19:48:35.965 INFO TaskSetManager - Starting task 0.0 in stage 80.0 (TID 120) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:35.965 INFO TaskSetManager - Starting task 1.0 in stage 80.0 (TID 121) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:35.965 INFO Executor - Running task 0.0 in stage 80.0 (TID 120)
19:48:35.965 INFO Executor - Running task 1.0 in stage 80.0 (TID 121)
19:48:35.970 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.970 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.970 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.970 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.970 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.970 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.972 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.972 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.972 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.972 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:35.972 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:35.972 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:35.984 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:35.984 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:35.986 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:35.986 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:35.993 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194835415172415818793169_0335_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest113688951973011312231.bam/_temporary/0/task_20250715194835415172415818793169_0335_r_000001
19:48:35.993 INFO SparkHadoopMapRedUtil - attempt_20250715194835415172415818793169_0335_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:35.994 INFO Executor - Finished task 1.0 in stage 80.0 (TID 121). 1729 bytes result sent to driver
19:48:35.994 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194835415172415818793169_0335_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest113688951973011312231.bam/_temporary/0/task_20250715194835415172415818793169_0335_r_000000
19:48:35.994 INFO SparkHadoopMapRedUtil - attempt_20250715194835415172415818793169_0335_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:35.994 INFO TaskSetManager - Finished task 1.0 in stage 80.0 (TID 121) in 29 ms on localhost (executor driver) (1/2)
19:48:35.994 INFO Executor - Finished task 0.0 in stage 80.0 (TID 120). 1729 bytes result sent to driver
19:48:35.995 INFO TaskSetManager - Finished task 0.0 in stage 80.0 (TID 120) in 30 ms on localhost (executor driver) (2/2)
19:48:35.995 INFO TaskSchedulerImpl - Removed TaskSet 80.0, whose tasks have all completed, from pool
19:48:35.995 INFO DAGScheduler - ResultStage 80 (runJob at SparkHadoopWriter.scala:83) finished in 0.038 s
19:48:35.995 INFO DAGScheduler - Job 59 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:35.995 INFO TaskSchedulerImpl - Killing all running tasks in stage 80: Stage finished
19:48:35.995 INFO DAGScheduler - Job 59 finished: runJob at SparkHadoopWriter.scala:83, took 0.123836 s
19:48:35.996 INFO SparkHadoopWriter - Start to commit write Job job_20250715194835415172415818793169_0335.
19:48:36.001 INFO SparkHadoopWriter - Write Job job_20250715194835415172415818793169_0335 committed. Elapsed time: 5 ms.
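The write jobs in this log (Jobs 59, 62, 65, 68) all follow the same two-stage pattern: a ShuffleMapStage produced by the repartition, then a ResultStage whose tasks each write one part-r-0000N.bam shard under _temporary before FileOutputCommitter (algorithm version 1) promotes them at job commit. A minimal sketch of that pattern, assuming hypothetical class and method names and using TextOutputFormat as a stand-in for the BAM-aware output format driven by AnySamSinkMultiple (this is not the test's actual code):

    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import scala.Tuple2;

    public final class ShardedWriteSketch {
        // Hypothetical helper, not GATK code: writes 'reads' as two shards under outputDir.
        public static void writeShards(JavaRDD<String> reads, String outputDir) {
            // Stage 1 (ShuffleMapStage): redistribute records into the desired number of shards.
            JavaRDD<String> repartitioned = reads.repartition(2);
            // Stage 2 (ResultStage): one task per partition pairs each record with a null key
            // and hands it to the Hadoop output format.
            JavaPairRDD<NullWritable, Text> keyed =
                    repartitioned.mapToPair(r -> new Tuple2<>(NullWritable.get(), new Text(r)));
            // Each task writes to <outputDir>/_temporary/0/task_.../part-r-0000N; the v1
            // committer moves the files into outputDir when SparkHadoopWriter commits the job.
            keyed.saveAsNewAPIHadoopFile(outputDir, NullWritable.class, Text.class, TextOutputFormat.class);
        }
    }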
19:48:36.003 INFO MemoryStore - Block broadcast_147 stored as values in memory (estimated size 297.9 KiB, free 1915.6 MiB)
19:48:36.013 INFO BlockManagerInfo - Removed broadcast_139_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.2 MiB)
19:48:36.013 INFO BlockManagerInfo - Removed broadcast_135_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.2 MiB)
19:48:36.014 INFO BlockManagerInfo - Removed broadcast_138_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:36.014 INFO BlockManagerInfo - Removed broadcast_134_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:48:36.016 INFO BlockManagerInfo - Removed broadcast_142_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:36.016 INFO BlockManagerInfo - Removed broadcast_146_piece0 on localhost:36125 in memory (size: 56.2 KiB, free: 1919.4 MiB)
19:48:36.018 INFO BlockManagerInfo - Removed broadcast_144_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:36.018 INFO BlockManagerInfo - Removed broadcast_145_piece0 on localhost:36125 in memory (size: 154.6 KiB, free: 1919.5 MiB)
19:48:36.019 INFO MemoryStore - Block broadcast_147_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
19:48:36.019 INFO BlockManagerInfo - Added broadcast_147_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:36.019 INFO BlockManagerInfo - Removed broadcast_136_piece0 on localhost:36125 in memory (size: 154.6 KiB, free: 1919.6 MiB)
19:48:36.019 INFO SparkContext - Created broadcast 147 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.020 INFO BlockManagerInfo - Removed broadcast_140_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.8 MiB)
19:48:36.020 INFO BlockManagerInfo - Removed broadcast_132_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:36.021 INFO BlockManagerInfo - Removed broadcast_143_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.8 MiB)
19:48:36.022 INFO BlockManagerInfo - Removed broadcast_137_piece0 on localhost:36125 in memory (size: 56.3 KiB, free: 1919.9 MiB)
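The interleaved "Removed broadcast_N_piece0" lines come from Spark's ContextCleaner reclaiming broadcast variables that are no longer referenced once their jobs finish; the same removal can also be triggered eagerly. A minimal sketch of that lifecycle, assuming a hypothetical header list as the broadcast payload (not the test's actual code):

    import java.util.Arrays;
    import java.util.List;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.broadcast.Broadcast;

    public final class BroadcastLifecycleSketch {
        // Hypothetical example: create, use, and tear down a broadcast variable.
        public static void demo(JavaSparkContext sc) {
            List<String> header = Arrays.asList("@HD\tVN:1.6", "@SQ\tSN:1\tLN:1000"); // stand-in payload
            Broadcast<List<String>> bc = sc.broadcast(header);   // logged as "Created broadcast N"
            // ... tasks read bc.value() while jobs run ...
            bc.destroy();                                        // blocks removed from the BlockManager,
                                                                 // as in the "Removed broadcast_N_piece0" lines
        }
    }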
19:48:36.043 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:36.044 INFO DAGScheduler - Got job 60 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:36.044 INFO DAGScheduler - Final stage: ResultStage 82 (count at ReadsSparkSinkUnitTest.java:222)
19:48:36.044 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 81)
19:48:36.044 INFO DAGScheduler - Missing parents: List()
19:48:36.045 INFO DAGScheduler - Submitting ResultStage 82 (MapPartitionsRDD[326] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:36.046 INFO MemoryStore - Block broadcast_148 stored as values in memory (estimated size 6.3 KiB, free 1919.3 MiB)
19:48:36.047 INFO MemoryStore - Block broadcast_148_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1919.3 MiB)
19:48:36.047 INFO BlockManagerInfo - Added broadcast_148_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.9 MiB)
19:48:36.047 INFO SparkContext - Created broadcast 148 from broadcast at DAGScheduler.scala:1580
19:48:36.047 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 82 (MapPartitionsRDD[326] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:36.047 INFO TaskSchedulerImpl - Adding task set 82.0 with 2 tasks resource profile 0
19:48:36.048 INFO TaskSetManager - Starting task 0.0 in stage 82.0 (TID 122) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:36.048 INFO TaskSetManager - Starting task 1.0 in stage 82.0 (TID 123) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:36.049 INFO Executor - Running task 0.0 in stage 82.0 (TID 122)
19:48:36.049 INFO Executor - Running task 1.0 in stage 82.0 (TID 123)
19:48:36.051 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.051 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.051 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.051 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.056 INFO Executor - Finished task 1.0 in stage 82.0 (TID 123). 1591 bytes result sent to driver
19:48:36.056 INFO Executor - Finished task 0.0 in stage 82.0 (TID 122). 1634 bytes result sent to driver
19:48:36.056 INFO TaskSetManager - Finished task 1.0 in stage 82.0 (TID 123) in 8 ms on localhost (executor driver) (1/2)
19:48:36.056 INFO TaskSetManager - Finished task 0.0 in stage 82.0 (TID 122) in 8 ms on localhost (executor driver) (2/2)
19:48:36.056 INFO TaskSchedulerImpl - Removed TaskSet 82.0, whose tasks have all completed, from pool
19:48:36.057 INFO DAGScheduler - ResultStage 82 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.012 s
19:48:36.057 INFO DAGScheduler - Job 60 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:36.057 INFO TaskSchedulerImpl - Killing all running tasks in stage 82: Stage finished
19:48:36.057 INFO DAGScheduler - Job 60 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.013263 s
19:48:36.070 INFO FileInputFormat - Total input files to process : 2
19:48:36.073 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:36.073 INFO DAGScheduler - Got job 61 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:36.073 INFO DAGScheduler - Final stage: ResultStage 83 (count at ReadsSparkSinkUnitTest.java:222)
19:48:36.073 INFO DAGScheduler - Parents of final stage: List()
19:48:36.074 INFO DAGScheduler - Missing parents: List()
19:48:36.074 INFO DAGScheduler - Submitting ResultStage 83 (MapPartitionsRDD[342] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:36.097 INFO MemoryStore - Block broadcast_149 stored as values in memory (estimated size 426.1 KiB, free 1918.9 MiB)
19:48:36.098 INFO MemoryStore - Block broadcast_149_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.7 MiB)
19:48:36.099 INFO BlockManagerInfo - Added broadcast_149_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:36.099 INFO SparkContext - Created broadcast 149 from broadcast at DAGScheduler.scala:1580
19:48:36.099 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 83 (MapPartitionsRDD[342] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:36.099 INFO TaskSchedulerImpl - Adding task set 83.0 with 2 tasks resource profile 0
19:48:36.100 INFO TaskSetManager - Starting task 0.0 in stage 83.0 (TID 124) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:36.100 INFO TaskSetManager - Starting task 1.0 in stage 83.0 (TID 125) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:36.100 INFO Executor - Running task 0.0 in stage 83.0 (TID 124)
19:48:36.100 INFO Executor - Running task 1.0 in stage 83.0 (TID 125)
19:48:36.132 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest113688951973011312231.bam/part-r-00000.bam:0+132492
19:48:36.142 INFO Executor - Finished task 1.0 in stage 83.0 (TID 125). 989 bytes result sent to driver
19:48:36.142 INFO TaskSetManager - Finished task 1.0 in stage 83.0 (TID 125) in 42 ms on localhost (executor driver) (1/2)
19:48:36.144 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest113688951973011312231.bam/part-r-00001.bam:0+129330
19:48:36.156 INFO Executor - Finished task 0.0 in stage 83.0 (TID 124). 989 bytes result sent to driver
19:48:36.156 INFO TaskSetManager - Finished task 0.0 in stage 83.0 (TID 124) in 56 ms on localhost (executor driver) (2/2)
19:48:36.156 INFO TaskSchedulerImpl - Removed TaskSet 83.0, whose tasks have all completed, from pool
19:48:36.157 INFO DAGScheduler - ResultStage 83 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.083 s
19:48:36.157 INFO DAGScheduler - Job 61 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:36.157 INFO TaskSchedulerImpl - Killing all running tasks in stage 83: Stage finished
19:48:36.157 INFO DAGScheduler - Job 61 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.083827 s
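Jobs 61, 64, and 67 verify the written output by listing the part-r-*.bam shards ("Total input files to process : 2"), loading them through newAPIHadoopFile, filtering, and counting. A minimal sketch of that read-back-and-count step, assuming hypothetical names and using TextInputFormat in place of the BAM-aware input format created at PathSplitSource.java:96 (not the test's actual code):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public final class ShardedReadSketch {
        // Hypothetical helper, not GATK code: counts records across all shards in outputDir.
        public static long countRecords(JavaSparkContext sc, String outputDir) {
            // FileInputFormat enumerates every part-r-* file in the directory, one split per shard here.
            JavaPairRDD<LongWritable, Text> records = sc.newAPIHadoopFile(
                    outputDir, TextInputFormat.class, LongWritable.class, Text.class, new Configuration());
            // A single ResultStage with one task per split, as in the count jobs above.
            return records.count();
        }
    }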
19:48:36.160 INFO MemoryStore - Block broadcast_150 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
19:48:36.166 INFO MemoryStore - Block broadcast_150_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
19:48:36.167 INFO BlockManagerInfo - Added broadcast_150_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:36.167 INFO SparkContext - Created broadcast 150 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.191 INFO MemoryStore - Block broadcast_151 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:36.197 INFO MemoryStore - Block broadcast_151_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
19:48:36.197 INFO BlockManagerInfo - Added broadcast_151_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:36.197 INFO SparkContext - Created broadcast 151 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.217 INFO MemoryStore - Block broadcast_152 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
19:48:36.218 INFO MemoryStore - Block broadcast_152_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
19:48:36.218 INFO BlockManagerInfo - Added broadcast_152_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.6 MiB)
19:48:36.218 INFO SparkContext - Created broadcast 152 from broadcast at ReadsSparkSink.java:133
19:48:36.220 INFO MemoryStore - Block broadcast_153 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
19:48:36.220 INFO MemoryStore - Block broadcast_153_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
19:48:36.221 INFO BlockManagerInfo - Added broadcast_153_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.6 MiB)
19:48:36.221 INFO SparkContext - Created broadcast 153 from broadcast at AnySamSinkMultiple.java:80
19:48:36.222 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.222 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.222 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.234 INFO FileInputFormat - Total input files to process : 1
19:48:36.241 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:36.241 INFO DAGScheduler - Registering RDD 350 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 18
19:48:36.241 INFO DAGScheduler - Got job 62 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:36.241 INFO DAGScheduler - Final stage: ResultStage 85 (runJob at SparkHadoopWriter.scala:83)
19:48:36.241 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 84)
19:48:36.242 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 84)
19:48:36.242 INFO DAGScheduler - Submitting ShuffleMapStage 84 (MapPartitionsRDD[350] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:36.259 INFO MemoryStore - Block broadcast_154 stored as values in memory (estimated size 427.7 KiB, free 1917.3 MiB)
19:48:36.260 INFO MemoryStore - Block broadcast_154_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.2 MiB)
19:48:36.260 INFO BlockManagerInfo - Added broadcast_154_piece0 in memory on localhost:36125 (size: 154.6 KiB, free: 1919.5 MiB)
19:48:36.261 INFO SparkContext - Created broadcast 154 from broadcast at DAGScheduler.scala:1580
19:48:36.261 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 84 (MapPartitionsRDD[350] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:36.261 INFO TaskSchedulerImpl - Adding task set 84.0 with 1 tasks resource profile 0
19:48:36.261 INFO TaskSetManager - Starting task 0.0 in stage 84.0 (TID 126) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:36.262 INFO Executor - Running task 0.0 in stage 84.0 (TID 126)
19:48:36.292 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:36.311 INFO Executor - Finished task 0.0 in stage 84.0 (TID 126). 1149 bytes result sent to driver
19:48:36.312 INFO TaskSetManager - Finished task 0.0 in stage 84.0 (TID 126) in 51 ms on localhost (executor driver) (1/1)
19:48:36.312 INFO TaskSchedulerImpl - Removed TaskSet 84.0, whose tasks have all completed, from pool
19:48:36.312 INFO DAGScheduler - ShuffleMapStage 84 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.070 s
19:48:36.312 INFO DAGScheduler - looking for newly runnable stages
19:48:36.312 INFO DAGScheduler - running: HashSet()
19:48:36.312 INFO DAGScheduler - waiting: HashSet(ResultStage 85)
19:48:36.312 INFO DAGScheduler - failed: HashSet()
19:48:36.312 INFO DAGScheduler - Submitting ResultStage 85 (MapPartitionsRDD[362] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:36.319 INFO MemoryStore - Block broadcast_155 stored as values in memory (estimated size 150.2 KiB, free 1917.0 MiB)
19:48:36.320 INFO MemoryStore - Block broadcast_155_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1917.0 MiB)
19:48:36.320 INFO BlockManagerInfo - Added broadcast_155_piece0 in memory on localhost:36125 (size: 56.2 KiB, free: 1919.4 MiB)
19:48:36.320 INFO SparkContext - Created broadcast 155 from broadcast at DAGScheduler.scala:1580
19:48:36.320 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 85 (MapPartitionsRDD[362] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:36.320 INFO TaskSchedulerImpl - Adding task set 85.0 with 2 tasks resource profile 0
19:48:36.321 INFO TaskSetManager - Starting task 0.0 in stage 85.0 (TID 127) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:36.321 INFO TaskSetManager - Starting task 1.0 in stage 85.0 (TID 128) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:36.322 INFO Executor - Running task 0.0 in stage 85.0 (TID 127)
19:48:36.322 INFO Executor - Running task 1.0 in stage 85.0 (TID 128)
19:48:36.326 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.326 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.326 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.327 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.327 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.327 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.328 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.328 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.328 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.329 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.329 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.329 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.341 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.341 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.341 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.341 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.348 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194836263640897639972779_0362_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest113282798255846713362.bam/_temporary/0/task_20250715194836263640897639972779_0362_r_000001
19:48:36.348 INFO SparkHadoopMapRedUtil - attempt_20250715194836263640897639972779_0362_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:36.349 INFO Executor - Finished task 1.0 in stage 85.0 (TID 128). 1729 bytes result sent to driver
19:48:36.349 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194836263640897639972779_0362_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest113282798255846713362.bam/_temporary/0/task_20250715194836263640897639972779_0362_r_000000
19:48:36.349 INFO SparkHadoopMapRedUtil - attempt_20250715194836263640897639972779_0362_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:36.350 INFO TaskSetManager - Finished task 1.0 in stage 85.0 (TID 128) in 28 ms on localhost (executor driver) (1/2)
19:48:36.350 INFO Executor - Finished task 0.0 in stage 85.0 (TID 127). 1729 bytes result sent to driver
19:48:36.350 INFO TaskSetManager - Finished task 0.0 in stage 85.0 (TID 127) in 29 ms on localhost (executor driver) (2/2)
19:48:36.350 INFO TaskSchedulerImpl - Removed TaskSet 85.0, whose tasks have all completed, from pool
19:48:36.350 INFO DAGScheduler - ResultStage 85 (runJob at SparkHadoopWriter.scala:83) finished in 0.037 s
19:48:36.351 INFO DAGScheduler - Job 62 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:36.351 INFO TaskSchedulerImpl - Killing all running tasks in stage 85: Stage finished
19:48:36.351 INFO DAGScheduler - Job 62 finished: runJob at SparkHadoopWriter.scala:83, took 0.110180 s
19:48:36.351 INFO SparkHadoopWriter - Start to commit write Job job_20250715194836263640897639972779_0362.
19:48:36.357 INFO SparkHadoopWriter - Write Job job_20250715194836263640897639972779_0362 committed. Elapsed time: 6 ms.
19:48:36.361 INFO MemoryStore - Block broadcast_156 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
19:48:36.371 INFO MemoryStore - Block broadcast_156_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
19:48:36.372 INFO BlockManagerInfo - Added broadcast_156_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:36.372 INFO SparkContext - Created broadcast 156 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.408 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:36.409 INFO DAGScheduler - Got job 63 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:36.409 INFO DAGScheduler - Final stage: ResultStage 87 (count at ReadsSparkSinkUnitTest.java:222)
19:48:36.409 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 86)
19:48:36.409 INFO DAGScheduler - Missing parents: List()
19:48:36.409 INFO DAGScheduler - Submitting ResultStage 87 (MapPartitionsRDD[353] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:36.410 INFO MemoryStore - Block broadcast_157 stored as values in memory (estimated size 6.3 KiB, free 1916.6 MiB)
19:48:36.410 INFO MemoryStore - Block broadcast_157_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.6 MiB)
19:48:36.410 INFO BlockManagerInfo - Added broadcast_157_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.4 MiB)
19:48:36.411 INFO SparkContext - Created broadcast 157 from broadcast at DAGScheduler.scala:1580
19:48:36.411 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 87 (MapPartitionsRDD[353] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:36.411 INFO TaskSchedulerImpl - Adding task set 87.0 with 2 tasks resource profile 0
19:48:36.412 INFO TaskSetManager - Starting task 0.0 in stage 87.0 (TID 129) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:36.412 INFO TaskSetManager - Starting task 1.0 in stage 87.0 (TID 130) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:36.412 INFO Executor - Running task 1.0 in stage 87.0 (TID 130)
19:48:36.412 INFO Executor - Running task 0.0 in stage 87.0 (TID 129)
19:48:36.414 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.414 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.414 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.414 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.419 INFO Executor - Finished task 0.0 in stage 87.0 (TID 129). 1634 bytes result sent to driver
19:48:36.419 INFO TaskSetManager - Finished task 0.0 in stage 87.0 (TID 129) in 8 ms on localhost (executor driver) (1/2)
19:48:36.419 INFO Executor - Finished task 1.0 in stage 87.0 (TID 130). 1634 bytes result sent to driver
19:48:36.419 INFO TaskSetManager - Finished task 1.0 in stage 87.0 (TID 130) in 7 ms on localhost (executor driver) (2/2)
19:48:36.420 INFO TaskSchedulerImpl - Removed TaskSet 87.0, whose tasks have all completed, from pool
19:48:36.420 INFO DAGScheduler - ResultStage 87 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.011 s
19:48:36.420 INFO DAGScheduler - Job 63 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:36.420 INFO TaskSchedulerImpl - Killing all running tasks in stage 87: Stage finished
19:48:36.420 INFO DAGScheduler - Job 63 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011630 s
19:48:36.433 INFO FileInputFormat - Total input files to process : 2
19:48:36.436 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:36.437 INFO DAGScheduler - Got job 64 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:36.437 INFO DAGScheduler - Final stage: ResultStage 88 (count at ReadsSparkSinkUnitTest.java:222)
19:48:36.437 INFO DAGScheduler - Parents of final stage: List()
19:48:36.437 INFO DAGScheduler - Missing parents: List()
19:48:36.437 INFO DAGScheduler - Submitting ResultStage 88 (MapPartitionsRDD[369] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:36.458 INFO MemoryStore - Block broadcast_158 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
19:48:36.459 INFO MemoryStore - Block broadcast_158_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.0 MiB)
19:48:36.459 INFO BlockManagerInfo - Added broadcast_158_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.2 MiB)
19:48:36.459 INFO SparkContext - Created broadcast 158 from broadcast at DAGScheduler.scala:1580
19:48:36.459 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 88 (MapPartitionsRDD[369] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:36.460 INFO TaskSchedulerImpl - Adding task set 88.0 with 2 tasks resource profile 0
19:48:36.460 INFO TaskSetManager - Starting task 0.0 in stage 88.0 (TID 131) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:36.460 INFO TaskSetManager - Starting task 1.0 in stage 88.0 (TID 132) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:36.461 INFO Executor - Running task 0.0 in stage 88.0 (TID 131)
19:48:36.461 INFO Executor - Running task 1.0 in stage 88.0 (TID 132)
19:48:36.499 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest113282798255846713362.bam/part-r-00001.bam:0+129330
19:48:36.504 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest113282798255846713362.bam/part-r-00000.bam:0+132492
19:48:36.508 INFO Executor - Finished task 0.0 in stage 88.0 (TID 131). 989 bytes result sent to driver
19:48:36.509 INFO TaskSetManager - Finished task 0.0 in stage 88.0 (TID 131) in 49 ms on localhost (executor driver) (1/2)
19:48:36.523 INFO Executor - Finished task 1.0 in stage 88.0 (TID 132). 1075 bytes result sent to driver
19:48:36.524 INFO BlockManagerInfo - Removed broadcast_154_piece0 on localhost:36125 in memory (size: 154.6 KiB, free: 1919.4 MiB)
19:48:36.524 INFO TaskSetManager - Finished task 1.0 in stage 88.0 (TID 132) in 64 ms on localhost (executor driver) (2/2)
19:48:36.524 INFO TaskSchedulerImpl - Removed TaskSet 88.0, whose tasks have all completed, from pool
19:48:36.524 INFO DAGScheduler - ResultStage 88 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.087 s
19:48:36.524 INFO DAGScheduler - Job 64 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:36.524 INFO TaskSchedulerImpl - Killing all running tasks in stage 88: Stage finished
19:48:36.525 INFO DAGScheduler - Job 64 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.088335 s
19:48:36.525 INFO BlockManagerInfo - Removed broadcast_141_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:36.526 INFO BlockManagerInfo - Removed broadcast_153_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:36.526 INFO BlockManagerInfo - Removed broadcast_157_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.4 MiB)
19:48:36.527 INFO BlockManagerInfo - Removed broadcast_151_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:36.528 INFO BlockManagerInfo - Removed broadcast_155_piece0 on localhost:36125 in memory (size: 56.2 KiB, free: 1919.5 MiB)
19:48:36.528 INFO BlockManagerInfo - Removed broadcast_152_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.5 MiB)
19:48:36.529 INFO MemoryStore - Block broadcast_159 stored as values in memory (estimated size 297.9 KiB, free 1917.6 MiB)
19:48:36.529 INFO BlockManagerInfo - Removed broadcast_147_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:36.530 INFO BlockManagerInfo - Removed broadcast_148_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.6 MiB)
19:48:36.530 INFO BlockManagerInfo - Removed broadcast_149_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.8 MiB)
19:48:36.536 INFO MemoryStore - Block broadcast_159_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
19:48:36.536 INFO BlockManagerInfo - Added broadcast_159_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:36.537 INFO SparkContext - Created broadcast 159 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.560 INFO MemoryStore - Block broadcast_160 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:36.566 INFO MemoryStore - Block broadcast_160_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
19:48:36.566 INFO BlockManagerInfo - Added broadcast_160_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:36.567 INFO SparkContext - Created broadcast 160 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.586 INFO MemoryStore - Block broadcast_161 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
19:48:36.587 INFO MemoryStore - Block broadcast_161_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
19:48:36.587 INFO BlockManagerInfo - Added broadcast_161_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.6 MiB)
19:48:36.588 INFO SparkContext - Created broadcast 161 from broadcast at ReadsSparkSink.java:133
19:48:36.589 INFO MemoryStore - Block broadcast_162 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
19:48:36.590 INFO MemoryStore - Block broadcast_162_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
19:48:36.590 INFO BlockManagerInfo - Added broadcast_162_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.6 MiB)
19:48:36.590 INFO SparkContext - Created broadcast 162 from broadcast at AnySamSinkMultiple.java:80
19:48:36.592 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.592 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.592 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.604 INFO FileInputFormat - Total input files to process : 1
19:48:36.610 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:36.611 INFO DAGScheduler - Registering RDD 377 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 19
19:48:36.611 INFO DAGScheduler - Got job 65 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:36.611 INFO DAGScheduler - Final stage: ResultStage 90 (runJob at SparkHadoopWriter.scala:83)
19:48:36.611 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 89)
19:48:36.611 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 89)
19:48:36.611 INFO DAGScheduler - Submitting ShuffleMapStage 89 (MapPartitionsRDD[377] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:36.628 INFO MemoryStore - Block broadcast_163 stored as values in memory (estimated size 427.7 KiB, free 1917.3 MiB)
19:48:36.630 INFO MemoryStore - Block broadcast_163_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.2 MiB)
19:48:36.630 INFO BlockManagerInfo - Added broadcast_163_piece0 in memory on localhost:36125 (size: 154.6 KiB, free: 1919.5 MiB)
19:48:36.630 INFO SparkContext - Created broadcast 163 from broadcast at DAGScheduler.scala:1580
19:48:36.630 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 89 (MapPartitionsRDD[377] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:36.630 INFO TaskSchedulerImpl - Adding task set 89.0 with 1 tasks resource profile 0
19:48:36.631 INFO TaskSetManager - Starting task 0.0 in stage 89.0 (TID 133) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:36.631 INFO Executor - Running task 0.0 in stage 89.0 (TID 133)
19:48:36.662 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:36.680 INFO Executor - Finished task 0.0 in stage 89.0 (TID 133). 1149 bytes result sent to driver
19:48:36.681 INFO TaskSetManager - Finished task 0.0 in stage 89.0 (TID 133) in 50 ms on localhost (executor driver) (1/1)
19:48:36.681 INFO TaskSchedulerImpl - Removed TaskSet 89.0, whose tasks have all completed, from pool
19:48:36.681 INFO DAGScheduler - ShuffleMapStage 89 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.069 s
19:48:36.681 INFO DAGScheduler - looking for newly runnable stages
19:48:36.681 INFO DAGScheduler - running: HashSet()
19:48:36.681 INFO DAGScheduler - waiting: HashSet(ResultStage 90)
19:48:36.681 INFO DAGScheduler - failed: HashSet()
19:48:36.681 INFO DAGScheduler - Submitting ResultStage 90 (MapPartitionsRDD[389] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:36.688 INFO MemoryStore - Block broadcast_164 stored as values in memory (estimated size 150.2 KiB, free 1917.0 MiB)
19:48:36.688 INFO MemoryStore - Block broadcast_164_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1917.0 MiB)
19:48:36.689 INFO BlockManagerInfo - Added broadcast_164_piece0 in memory on localhost:36125 (size: 56.3 KiB, free: 1919.4 MiB)
19:48:36.689 INFO SparkContext - Created broadcast 164 from broadcast at DAGScheduler.scala:1580
19:48:36.689 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 90 (MapPartitionsRDD[389] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:36.689 INFO TaskSchedulerImpl - Adding task set 90.0 with 2 tasks resource profile 0
19:48:36.690 INFO TaskSetManager - Starting task 0.0 in stage 90.0 (TID 134) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:36.690 INFO TaskSetManager - Starting task 1.0 in stage 90.0 (TID 135) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:36.690 INFO Executor - Running task 0.0 in stage 90.0 (TID 134)
19:48:36.690 INFO Executor - Running task 1.0 in stage 90.0 (TID 135)
19:48:36.695 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.695 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.695 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.695 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.695 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.695 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.697 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.697 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.697 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.697 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.697 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.697 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.709 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.709 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.711 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.711 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.717 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948367072702141045191248_0389_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest117422716839149950971.bam/_temporary/0/task_202507151948367072702141045191248_0389_r_000000
19:48:36.717 INFO SparkHadoopMapRedUtil - attempt_202507151948367072702141045191248_0389_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:36.717 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948367072702141045191248_0389_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest117422716839149950971.bam/_temporary/0/task_202507151948367072702141045191248_0389_r_000001
19:48:36.717 INFO SparkHadoopMapRedUtil - attempt_202507151948367072702141045191248_0389_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:36.718 INFO Executor - Finished task 0.0 in stage 90.0 (TID 134). 1729 bytes result sent to driver
19:48:36.718 INFO Executor - Finished task 1.0 in stage 90.0 (TID 135). 1729 bytes result sent to driver
19:48:36.718 INFO TaskSetManager - Finished task 0.0 in stage 90.0 (TID 134) in 28 ms on localhost (executor driver) (1/2)
19:48:36.718 INFO TaskSetManager - Finished task 1.0 in stage 90.0 (TID 135) in 28 ms on localhost (executor driver) (2/2)
19:48:36.718 INFO TaskSchedulerImpl - Removed TaskSet 90.0, whose tasks have all completed, from pool
19:48:36.719 INFO DAGScheduler - ResultStage 90 (runJob at SparkHadoopWriter.scala:83) finished in 0.037 s
19:48:36.719 INFO DAGScheduler - Job 65 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:36.719 INFO TaskSchedulerImpl - Killing all running tasks in stage 90: Stage finished
19:48:36.719 INFO DAGScheduler - Job 65 finished: runJob at SparkHadoopWriter.scala:83, took 0.108609 s
19:48:36.719 INFO SparkHadoopWriter - Start to commit write Job job_202507151948367072702141045191248_0389.
19:48:36.724 INFO SparkHadoopWriter - Write Job job_202507151948367072702141045191248_0389 committed. Elapsed time: 4 ms.
19:48:36.727 INFO MemoryStore - Block broadcast_165 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
19:48:36.734 INFO MemoryStore - Block broadcast_165_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
19:48:36.734 INFO BlockManagerInfo - Added broadcast_165_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:36.734 INFO SparkContext - Created broadcast 165 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.758 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:36.759 INFO DAGScheduler - Got job 66 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:36.759 INFO DAGScheduler - Final stage: ResultStage 92 (count at ReadsSparkSinkUnitTest.java:222)
19:48:36.759 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 91)
19:48:36.759 INFO DAGScheduler - Missing parents: List()
19:48:36.759 INFO DAGScheduler - Submitting ResultStage 92 (MapPartitionsRDD[380] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:36.760 INFO MemoryStore - Block broadcast_166 stored as values in memory (estimated size 6.3 KiB, free 1916.6 MiB)
19:48:36.760 INFO MemoryStore - Block broadcast_166_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.6 MiB)
19:48:36.760 INFO BlockManagerInfo - Added broadcast_166_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.4 MiB)
19:48:36.761 INFO SparkContext - Created broadcast 166 from broadcast at DAGScheduler.scala:1580
19:48:36.761 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 92 (MapPartitionsRDD[380] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:36.761 INFO TaskSchedulerImpl - Adding task set 92.0 with 2 tasks resource profile 0
19:48:36.762 INFO TaskSetManager - Starting task 0.0 in stage 92.0 (TID 136) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:36.762 INFO TaskSetManager - Starting task 1.0 in stage 92.0 (TID 137) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:36.762 INFO Executor - Running task 1.0 in stage 92.0 (TID 137)
19:48:36.762 INFO Executor - Running task 0.0 in stage 92.0 (TID 136)
19:48:36.764 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.764 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:36.764 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.764 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:36.768 INFO Executor - Finished task 1.0 in stage 92.0 (TID 137). 1634 bytes result sent to driver
19:48:36.768 INFO TaskSetManager - Finished task 1.0 in stage 92.0 (TID 137) in 6 ms on localhost (executor driver) (1/2)
19:48:36.768 INFO Executor - Finished task 0.0 in stage 92.0 (TID 136). 1634 bytes result sent to driver
19:48:36.769 INFO TaskSetManager - Finished task 0.0 in stage 92.0 (TID 136) in 8 ms on localhost (executor driver) (2/2)
19:48:36.769 INFO TaskSchedulerImpl - Removed TaskSet 92.0, whose tasks have all completed, from pool
19:48:36.769 INFO DAGScheduler - ResultStage 92 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
19:48:36.769 INFO DAGScheduler - Job 66 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:36.769 INFO TaskSchedulerImpl - Killing all running tasks in stage 92: Stage finished
19:48:36.769 INFO DAGScheduler - Job 66 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010697 s
19:48:36.781 INFO FileInputFormat - Total input files to process : 2
19:48:36.786 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:36.786 INFO DAGScheduler - Got job 67 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:36.786 INFO DAGScheduler - Final stage: ResultStage 93 (count at ReadsSparkSinkUnitTest.java:222)
19:48:36.786 INFO DAGScheduler - Parents of final stage: List()
19:48:36.786 INFO DAGScheduler - Missing parents: List()
19:48:36.786 INFO DAGScheduler - Submitting ResultStage 93 (MapPartitionsRDD[396] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:36.812 INFO MemoryStore - Block broadcast_167 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
19:48:36.814 INFO MemoryStore - Block broadcast_167_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.1 MiB)
19:48:36.814 INFO BlockManagerInfo - Added broadcast_167_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.2 MiB)
19:48:36.814 INFO SparkContext - Created broadcast 167 from broadcast at DAGScheduler.scala:1580
19:48:36.815 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 93 (MapPartitionsRDD[396] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:36.815 INFO TaskSchedulerImpl - Adding task set 93.0 with 2 tasks resource profile 0
19:48:36.815 INFO TaskSetManager - Starting task 0.0 in stage 93.0 (TID 138) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:36.815 INFO TaskSetManager - Starting task 1.0 in stage 93.0 (TID 139) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:36.816 INFO Executor - Running task 1.0 in stage 93.0 (TID 139)
19:48:36.816 INFO Executor - Running task 0.0 in stage 93.0 (TID 138)
19:48:36.845 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117422716839149950971.bam/part-r-00000.bam:0+132492
19:48:36.855 INFO Executor - Finished task 1.0 in stage 93.0 (TID 139). 989 bytes result sent to driver
19:48:36.855 INFO TaskSetManager - Finished task 1.0 in stage 93.0 (TID 139) in 40 ms on localhost (executor driver) (1/2)
19:48:36.860 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117422716839149950971.bam/part-r-00001.bam:0+129330
19:48:36.872 INFO Executor - Finished task 0.0 in stage 93.0 (TID 138). 989 bytes result sent to driver
19:48:36.872 INFO TaskSetManager - Finished task 0.0 in stage 93.0 (TID 138) in 57 ms on localhost (executor driver) (2/2)
19:48:36.872 INFO TaskSchedulerImpl - Removed TaskSet 93.0, whose tasks have all completed, from pool
19:48:36.872 INFO DAGScheduler - ResultStage 93 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.085 s
19:48:36.872 INFO DAGScheduler - Job 67 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:36.872 INFO TaskSchedulerImpl - Killing all running tasks in stage 93: Stage finished
19:48:36.873 INFO DAGScheduler - Job 67 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.086837 s
19:48:36.876 INFO MemoryStore - Block broadcast_168 stored as values in memory (estimated size 297.9 KiB, free 1915.8 MiB)
19:48:36.885 INFO MemoryStore - Block broadcast_168_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.7 MiB)
19:48:36.886 INFO BlockManagerInfo - Added broadcast_168_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.2 MiB)
19:48:36.886 INFO SparkContext - Created broadcast 168 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.920 INFO MemoryStore - Block broadcast_169 stored as values in memory (estimated size 297.9 KiB, free 1915.4 MiB)
19:48:36.926 INFO MemoryStore - Block broadcast_169_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.4 MiB)
19:48:36.926 INFO BlockManagerInfo - Added broadcast_169_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.1 MiB)
19:48:36.927 INFO SparkContext - Created broadcast 169 from newAPIHadoopFile at PathSplitSource.java:96
19:48:36.947 INFO MemoryStore - Block broadcast_170 stored as values in memory (estimated size 160.7 KiB, free 1915.2 MiB)
19:48:36.948 INFO MemoryStore - Block broadcast_170_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.2 MiB)
19:48:36.948 INFO BlockManagerInfo - Added broadcast_170_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:36.948 INFO SparkContext - Created broadcast 170 from broadcast at ReadsSparkSink.java:133
19:48:36.950 INFO MemoryStore - Block broadcast_171 stored as values in memory (estimated size 163.2 KiB, free 1915.0 MiB)
19:48:36.959 INFO BlockManagerInfo - Removed broadcast_156_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:36.959 INFO MemoryStore - Block broadcast_171_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.4 MiB)
19:48:36.959 INFO BlockManagerInfo - Added broadcast_171_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.2 MiB)
19:48:36.960 INFO SparkContext - Created broadcast 171 from broadcast at AnySamSinkMultiple.java:80
19:48:36.960 INFO BlockManagerInfo - Removed broadcast_158_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.3 MiB)
19:48:36.961 INFO BlockManagerInfo - Removed broadcast_166_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.3 MiB)
19:48:36.961 INFO BlockManagerInfo - Removed broadcast_169_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:36.962 INFO BlockManagerInfo - Removed broadcast_167_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:36.962 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:36.962 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:36.962 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:36.963 INFO BlockManagerInfo - Removed broadcast_164_piece0 on localhost:36125 in memory (size: 56.3 KiB, free: 1919.6 MiB)
19:48:36.964 INFO BlockManagerInfo - Removed broadcast_150_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:36.964 INFO BlockManagerInfo - Removed broadcast_162_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:36.965 INFO BlockManagerInfo - Removed broadcast_163_piece0 on localhost:36125 in memory (size: 154.6 KiB, free: 1919.8 MiB)
19:48:36.965 INFO BlockManagerInfo - Removed broadcast_160_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:36.966 INFO BlockManagerInfo - Removed broadcast_159_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:36.967 INFO BlockManagerInfo - Removed broadcast_165_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:36.968 INFO BlockManagerInfo - Removed broadcast_161_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:36.976 INFO FileInputFormat - Total input files to process : 1
19:48:36.982 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:36.982 INFO DAGScheduler - Registering RDD 404 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 20
19:48:36.983 INFO DAGScheduler - Got job 68 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:36.983 INFO DAGScheduler - Final stage: ResultStage 95 (runJob at SparkHadoopWriter.scala:83)
19:48:36.983 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 94)
19:48:36.983 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 94)
19:48:36.983 INFO DAGScheduler - Submitting ShuffleMapStage 94 (MapPartitionsRDD[404] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:37.002 INFO MemoryStore - Block broadcast_172 stored as values in memory (estimated size 427.7 KiB, free 1918.9 MiB)
19:48:37.003 INFO MemoryStore - Block broadcast_172_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.8 MiB)
19:48:37.003 INFO BlockManagerInfo - Added broadcast_172_piece0 in memory on localhost:36125 (size: 154.6 KiB, free: 1919.8 MiB)
19:48:37.004 INFO SparkContext - Created broadcast 172 from broadcast at DAGScheduler.scala:1580
19:48:37.004 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 94 (MapPartitionsRDD[404] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:37.004 INFO TaskSchedulerImpl - Adding task set 94.0 with 1 tasks resource profile 0
19:48:37.004 INFO TaskSetManager - Starting task 0.0 in stage 94.0 (TID 140) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:37.005 INFO Executor - Running task 0.0 in stage 94.0 (TID 140)
19:48:37.035 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:37.052 INFO Executor - Finished task 0.0 in stage 94.0 (TID 140). 1149 bytes result sent to driver
19:48:37.052 INFO TaskSetManager - Finished task 0.0 in stage 94.0 (TID 140) in 48 ms on localhost (executor driver) (1/1)
19:48:37.053 INFO TaskSchedulerImpl - Removed TaskSet 94.0, whose tasks have all completed, from pool
19:48:37.053 INFO DAGScheduler - ShuffleMapStage 94 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.070 s
19:48:37.053 INFO DAGScheduler - looking for newly runnable stages
19:48:37.053 INFO DAGScheduler - running: HashSet()
19:48:37.053 INFO DAGScheduler - waiting: HashSet(ResultStage 95)
19:48:37.053 INFO DAGScheduler - failed: HashSet()
19:48:37.053 INFO DAGScheduler - Submitting ResultStage 95 (MapPartitionsRDD[416] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:37.059 INFO MemoryStore - Block broadcast_173 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
19:48:37.060 INFO MemoryStore - Block broadcast_173_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1918.6 MiB)
19:48:37.060 INFO BlockManagerInfo - Added broadcast_173_piece0 in memory on localhost:36125 (size: 56.3 KiB, free: 1919.7 MiB)
19:48:37.060 INFO SparkContext - Created broadcast 173 from broadcast at DAGScheduler.scala:1580
19:48:37.061 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 95 (MapPartitionsRDD[416] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.061 INFO TaskSchedulerImpl - Adding task set 95.0 with 2 tasks resource profile 0
19:48:37.061 INFO TaskSetManager - Starting task 0.0 in stage 95.0 (TID 141) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:37.061 INFO TaskSetManager - Starting task 1.0 in stage 95.0 (TID 142) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:37.062 INFO Executor - Running task 0.0 in stage 95.0 (TID 141)
19:48:37.062 INFO Executor - Running task 1.0 in stage 95.0 (TID 142)
19:48:37.068 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.068 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.068 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.068 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.068 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.068 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.068 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.068 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.068 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.068 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.068 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.068 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.078 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.078 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.084 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.084 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.086 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948366955714559165623975_0416_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest115207248422026731371.bam/_temporary/0/task_202507151948366955714559165623975_0416_r_000001
19:48:37.086 INFO SparkHadoopMapRedUtil - attempt_202507151948366955714559165623975_0416_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:37.086 INFO Executor - Finished task 1.0 in stage 95.0 (TID 142). 1729 bytes result sent to driver
19:48:37.087 INFO TaskSetManager - Finished task 1.0 in stage 95.0 (TID 142) in 26 ms on localhost (executor driver) (1/2)
19:48:37.092 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948366955714559165623975_0416_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest115207248422026731371.bam/_temporary/0/task_202507151948366955714559165623975_0416_r_000000
19:48:37.092 INFO SparkHadoopMapRedUtil - attempt_202507151948366955714559165623975_0416_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:37.092 INFO Executor - Finished task 0.0 in stage 95.0 (TID 141). 1729 bytes result sent to driver
19:48:37.092 INFO TaskSetManager - Finished task 0.0 in stage 95.0 (TID 141) in 31 ms on localhost (executor driver) (2/2)
19:48:37.092 INFO TaskSchedulerImpl - Removed TaskSet 95.0, whose tasks have all completed, from pool
19:48:37.093 INFO DAGScheduler - ResultStage 95 (runJob at SparkHadoopWriter.scala:83) finished in 0.039 s
19:48:37.093 INFO DAGScheduler - Job 68 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.093 INFO TaskSchedulerImpl - Killing all running tasks in stage 95: Stage finished
19:48:37.093 INFO DAGScheduler - Job 68 finished: runJob at SparkHadoopWriter.scala:83, took 0.110917 s
19:48:37.093 INFO SparkHadoopWriter - Start to commit write Job job_202507151948366955714559165623975_0416.
19:48:37.098 INFO SparkHadoopWriter - Write Job job_202507151948366955714559165623975_0416 committed. Elapsed time: 5 ms.
19:48:37.101 INFO MemoryStore - Block broadcast_174 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
19:48:37.110 INFO MemoryStore - Block broadcast_174_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
19:48:37.110 INFO BlockManagerInfo - Added broadcast_174_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:37.111 INFO SparkContext - Created broadcast 174 from newAPIHadoopFile at PathSplitSource.java:96
19:48:37.133 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:37.134 INFO DAGScheduler - Got job 69 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:37.134 INFO DAGScheduler - Final stage: ResultStage 97 (count at ReadsSparkSinkUnitTest.java:222)
19:48:37.134 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 96)
19:48:37.134 INFO DAGScheduler - Missing parents: List()
19:48:37.134 INFO DAGScheduler - Submitting ResultStage 97 (MapPartitionsRDD[407] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:37.135 INFO MemoryStore - Block broadcast_175 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
19:48:37.135 INFO MemoryStore - Block broadcast_175_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
19:48:37.136 INFO BlockManagerInfo - Added broadcast_175_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.7 MiB)
19:48:37.136 INFO SparkContext - Created broadcast 175 from broadcast at DAGScheduler.scala:1580
19:48:37.136 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 97 (MapPartitionsRDD[407] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.136 INFO TaskSchedulerImpl - Adding task set 97.0 with 2 tasks resource profile 0
19:48:37.137 INFO TaskSetManager - Starting task 0.0 in stage 97.0 (TID 143) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:37.137 INFO TaskSetManager - Starting task 1.0 in stage 97.0 (TID 144) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:37.137 INFO Executor - Running task 0.0 in stage 97.0 (TID 143)
19:48:37.137 INFO Executor - Running task 1.0 in stage 97.0 (TID 144)
19:48:37.139 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.139 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.139 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.139 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.143 INFO Executor - Finished task 1.0 in stage 97.0 (TID 144). 1634 bytes result sent to driver
19:48:37.143 INFO TaskSetManager - Finished task 1.0 in stage 97.0 (TID 144) in 6 ms on localhost (executor driver) (1/2)
19:48:37.144 INFO Executor - Finished task 0.0 in stage 97.0 (TID 143). 1634 bytes result sent to driver
19:48:37.144 INFO TaskSetManager - Finished task 0.0 in stage 97.0 (TID 143) in 7 ms on localhost (executor driver) (2/2)
19:48:37.144 INFO TaskSchedulerImpl - Removed TaskSet 97.0, whose tasks have all completed, from pool
19:48:37.144 INFO DAGScheduler - ResultStage 97 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
19:48:37.144 INFO DAGScheduler - Job 69 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.144 INFO TaskSchedulerImpl - Killing all running tasks in stage 97: Stage finished
19:48:37.144 INFO DAGScheduler - Job 69 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010932 s
19:48:37.157 INFO FileInputFormat - Total input files to process : 2
19:48:37.161 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:37.162 INFO DAGScheduler - Got job 70 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:37.162 INFO DAGScheduler - Final stage: ResultStage 98 (count at ReadsSparkSinkUnitTest.java:222)
19:48:37.162 INFO DAGScheduler - Parents of final stage: List()
19:48:37.162 INFO DAGScheduler - Missing parents: List()
19:48:37.162 INFO DAGScheduler - Submitting ResultStage 98 (MapPartitionsRDD[423] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:37.179 INFO MemoryStore - Block broadcast_176 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
19:48:37.180 INFO MemoryStore - Block broadcast_176_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
19:48:37.180 INFO BlockManagerInfo - Added broadcast_176_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:37.181 INFO SparkContext - Created broadcast 176 from broadcast at DAGScheduler.scala:1580
19:48:37.181 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 98 (MapPartitionsRDD[423] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.181 INFO TaskSchedulerImpl - Adding task set 98.0 with 2 tasks resource profile 0
19:48:37.181 INFO TaskSetManager - Starting task 0.0 in stage 98.0 (TID 145) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:37.182 INFO TaskSetManager - Starting task 1.0 in stage 98.0 (TID 146) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:37.182 INFO Executor - Running task 0.0 in stage 98.0 (TID 145)
19:48:37.182 INFO Executor - Running task 1.0 in stage 98.0 (TID 146)
19:48:37.214 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115207248422026731371.bam/part-r-00000.bam:0+132492
19:48:37.223 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115207248422026731371.bam/part-r-00001.bam:0+129330
19:48:37.228 INFO Executor - Finished task 1.0 in stage 98.0 (TID 146). 989 bytes result sent to driver
19:48:37.229 INFO TaskSetManager - Finished task 1.0 in stage 98.0 (TID 146) in 47 ms on localhost (executor driver) (1/2)
19:48:37.232 INFO Executor - Finished task 0.0 in stage 98.0 (TID 145). 989 bytes result sent to driver
19:48:37.233 INFO TaskSetManager - Finished task 0.0 in stage 98.0 (TID 145) in 52 ms on localhost (executor driver) (2/2)
19:48:37.233 INFO TaskSchedulerImpl - Removed TaskSet 98.0, whose tasks have all completed, from pool
19:48:37.233 INFO DAGScheduler - ResultStage 98 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.071 s
19:48:37.233 INFO DAGScheduler - Job 70 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.233 INFO TaskSchedulerImpl - Killing all running tasks in stage 98: Stage finished
19:48:37.233 INFO DAGScheduler - Job 70 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.071815 s
19:48:37.237 INFO MemoryStore - Block broadcast_177 stored as values in memory (estimated size 298.0 KiB, free 1917.3 MiB)
19:48:37.247 INFO MemoryStore - Block broadcast_177_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.3 MiB)
19:48:37.247 INFO BlockManagerInfo - Added broadcast_177_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.5 MiB)
19:48:37.248 INFO SparkContext - Created broadcast 177 from newAPIHadoopFile at PathSplitSource.java:96
19:48:37.271 INFO MemoryStore - Block broadcast_178 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
19:48:37.277 INFO MemoryStore - Block broadcast_178_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.0 MiB)
19:48:37.277 INFO BlockManagerInfo - Added broadcast_178_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.4 MiB)
19:48:37.278 INFO SparkContext - Created broadcast 178 from newAPIHadoopFile at PathSplitSource.java:96
19:48:37.298 INFO MemoryStore - Block broadcast_179 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
19:48:37.299 INFO MemoryStore - Block broadcast_179_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
19:48:37.299 INFO BlockManagerInfo - Added broadcast_179_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:48:37.299 INFO SparkContext - Created broadcast 179 from broadcast at ReadsSparkSink.java:133
19:48:37.300 INFO MemoryStore - Block broadcast_180 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
19:48:37.301 INFO MemoryStore - Block broadcast_180_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
19:48:37.301 INFO BlockManagerInfo - Added broadcast_180_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:48:37.301 INFO SparkContext - Created broadcast 180 from broadcast at AnySamSinkMultiple.java:80
19:48:37.303 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.303 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.303 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.315 INFO FileInputFormat - Total input files to process : 1
19:48:37.321 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:37.321 INFO DAGScheduler - Registering RDD 431 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 21
19:48:37.321 INFO DAGScheduler - Got job 71 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:37.321 INFO DAGScheduler - Final stage: ResultStage 100 (runJob at SparkHadoopWriter.scala:83)
19:48:37.321 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 99)
19:48:37.322 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 99)
19:48:37.322 INFO DAGScheduler - Submitting ShuffleMapStage 99 (MapPartitionsRDD[431] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:37.345 INFO MemoryStore - Block broadcast_181 stored as values in memory (estimated size 427.7 KiB, free 1916.2 MiB)
19:48:37.346 INFO MemoryStore - Block broadcast_181_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.1 MiB)
19:48:37.346 INFO BlockManagerInfo - Added broadcast_181_piece0 in memory on localhost:36125 (size: 154.6 KiB, free: 1919.3 MiB)
19:48:37.346 INFO SparkContext - Created broadcast 181 from broadcast at DAGScheduler.scala:1580
19:48:37.346 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 99 (MapPartitionsRDD[431] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:37.347 INFO TaskSchedulerImpl - Adding task set 99.0 with 1 tasks resource profile 0
19:48:37.347 INFO TaskSetManager - Starting task 0.0 in stage 99.0 (TID 147) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
19:48:37.347 INFO Executor - Running task 0.0 in stage 99.0 (TID 147)
19:48:37.383 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:37.402 INFO Executor - Finished task 0.0 in stage 99.0 (TID 147). 1149 bytes result sent to driver
19:48:37.402 INFO TaskSetManager - Finished task 0.0 in stage 99.0 (TID 147) in 55 ms on localhost (executor driver) (1/1)
19:48:37.402 INFO TaskSchedulerImpl - Removed TaskSet 99.0, whose tasks have all completed, from pool
19:48:37.403 INFO DAGScheduler - ShuffleMapStage 99 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.081 s
19:48:37.403 INFO DAGScheduler - looking for newly runnable stages
19:48:37.403 INFO DAGScheduler - running: HashSet()
19:48:37.403 INFO DAGScheduler - waiting: HashSet(ResultStage 100)
19:48:37.403 INFO DAGScheduler - failed: HashSet()
19:48:37.403 INFO DAGScheduler - Submitting ResultStage 100 (MapPartitionsRDD[443] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:37.414 INFO MemoryStore - Block broadcast_182 stored as values in memory (estimated size 150.2 KiB, free 1915.9 MiB)
19:48:37.415 INFO MemoryStore - Block broadcast_182_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1915.9 MiB)
19:48:37.415 INFO BlockManagerInfo - Added broadcast_182_piece0 in memory on localhost:36125 (size: 56.2 KiB, free: 1919.2 MiB)
19:48:37.415 INFO SparkContext - Created broadcast 182 from broadcast at DAGScheduler.scala:1580
19:48:37.415 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 100 (MapPartitionsRDD[443] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.415 INFO TaskSchedulerImpl - Adding task set 100.0 with 2 tasks resource profile 0
19:48:37.416 INFO TaskSetManager - Starting task 0.0 in stage 100.0 (TID 148) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:37.416 INFO TaskSetManager - Starting task 1.0 in stage 100.0 (TID 149) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:37.416 INFO Executor - Running task 0.0 in stage 100.0 (TID 148)
19:48:37.416 INFO Executor - Running task 1.0 in stage 100.0 (TID 149)
19:48:37.421 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.421 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.421 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.421 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.421 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.421 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.423 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.423 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.423 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.423 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.423 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.423 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.434 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.434 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.435 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.435 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.440 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948371951128895770137609_0443_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest211153268345737747596.bam/_temporary/0/task_202507151948371951128895770137609_0443_r_000001
19:48:37.440 INFO SparkHadoopMapRedUtil - attempt_202507151948371951128895770137609_0443_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:37.441 INFO Executor - Finished task 1.0 in stage 100.0 (TID 149). 1729 bytes result sent to driver
19:48:37.441 INFO TaskSetManager - Finished task 1.0 in stage 100.0 (TID 149) in 25 ms on localhost (executor driver) (1/2)
19:48:37.443 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948371951128895770137609_0443_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest211153268345737747596.bam/_temporary/0/task_202507151948371951128895770137609_0443_r_000000
19:48:37.443 INFO SparkHadoopMapRedUtil - attempt_202507151948371951128895770137609_0443_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:37.443 INFO Executor - Finished task 0.0 in stage 100.0 (TID 148). 1729 bytes result sent to driver
19:48:37.443 INFO TaskSetManager - Finished task 0.0 in stage 100.0 (TID 148) in 27 ms on localhost (executor driver) (2/2)
19:48:37.443 INFO TaskSchedulerImpl - Removed TaskSet 100.0, whose tasks have all completed, from pool
19:48:37.444 INFO DAGScheduler - ResultStage 100 (runJob at SparkHadoopWriter.scala:83) finished in 0.041 s
19:48:37.444 INFO DAGScheduler - Job 71 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.444 INFO TaskSchedulerImpl - Killing all running tasks in stage 100: Stage finished
19:48:37.444 INFO DAGScheduler - Job 71 finished: runJob at SparkHadoopWriter.scala:83, took 0.123090 s
19:48:37.444 INFO SparkHadoopWriter - Start to commit write Job job_202507151948371951128895770137609_0443.
19:48:37.450 INFO SparkHadoopWriter - Write Job job_202507151948371951128895770137609_0443 committed. Elapsed time: 5 ms.
19:48:37.452 INFO MemoryStore - Block broadcast_183 stored as values in memory (estimated size 297.9 KiB, free 1915.6 MiB)
19:48:37.461 INFO BlockManagerInfo - Removed broadcast_168_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:37.461 INFO BlockManagerInfo - Removed broadcast_181_piece0 on localhost:36125 in memory (size: 154.6 KiB, free: 1919.4 MiB)
19:48:37.462 INFO BlockManagerInfo - Removed broadcast_180_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:37.463 INFO BlockManagerInfo - Removed broadcast_179_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:37.463 INFO BlockManagerInfo - Removed broadcast_170_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:37.464 INFO BlockManagerInfo - Removed broadcast_176_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.6 MiB)
19:48:37.464 INFO BlockManagerInfo - Removed broadcast_182_piece0 on localhost:36125 in memory (size: 56.2 KiB, free: 1919.6 MiB)
19:48:37.465 INFO BlockManagerInfo - Removed broadcast_175_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.6 MiB)
19:48:37.466 INFO BlockManagerInfo - Removed broadcast_178_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.7 MiB)
19:48:37.466 INFO BlockManagerInfo - Removed broadcast_172_piece0 on localhost:36125 in memory (size: 154.6 KiB, free: 1919.8 MiB)
19:48:37.467 INFO BlockManagerInfo - Removed broadcast_173_piece0 on localhost:36125 in memory (size: 56.3 KiB, free: 1919.9 MiB)
19:48:37.467 INFO BlockManagerInfo - Removed broadcast_174_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:37.468 INFO MemoryStore - Block broadcast_183_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:37.468 INFO BlockManagerInfo - Removed broadcast_171_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1920.0 MiB)
19:48:37.468 INFO BlockManagerInfo - Added broadcast_183_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:37.469 INFO SparkContext - Created broadcast 183 from newAPIHadoopFile at PathSplitSource.java:96
19:48:37.494 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:37.495 INFO DAGScheduler - Got job 72 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:37.495 INFO DAGScheduler - Final stage: ResultStage 102 (count at ReadsSparkSinkUnitTest.java:222)
19:48:37.495 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 101)
19:48:37.495 INFO DAGScheduler - Missing parents: List()
19:48:37.495 INFO DAGScheduler - Submitting ResultStage 102 (MapPartitionsRDD[434] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:37.496 INFO MemoryStore - Block broadcast_184 stored as values in memory (estimated size 6.3 KiB, free 1919.3 MiB)
19:48:37.496 INFO MemoryStore - Block broadcast_184_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1919.3 MiB)
19:48:37.496 INFO BlockManagerInfo - Added broadcast_184_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.9 MiB)
19:48:37.496 INFO SparkContext - Created broadcast 184 from broadcast at DAGScheduler.scala:1580
19:48:37.497 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 102 (MapPartitionsRDD[434] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.497 INFO TaskSchedulerImpl - Adding task set 102.0 with 2 tasks resource profile 0
19:48:37.497 INFO TaskSetManager - Starting task 0.0 in stage 102.0 (TID 150) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:37.498 INFO TaskSetManager - Starting task 1.0 in stage 102.0 (TID 151) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:37.498 INFO Executor - Running task 1.0 in stage 102.0 (TID 151)
19:48:37.498 INFO Executor - Running task 0.0 in stage 102.0 (TID 150)
19:48:37.500 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.500 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.500 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.500 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.504 INFO Executor - Finished task 1.0 in stage 102.0 (TID 151). 1591 bytes result sent to driver
19:48:37.504 INFO Executor - Finished task 0.0 in stage 102.0 (TID 150). 1591 bytes result sent to driver
19:48:37.504 INFO TaskSetManager - Finished task 0.0 in stage 102.0 (TID 150) in 7 ms on localhost (executor driver) (1/2)
19:48:37.505 INFO TaskSetManager - Finished task 1.0 in stage 102.0 (TID 151) in 8 ms on localhost (executor driver) (2/2)
19:48:37.505 INFO TaskSchedulerImpl - Removed TaskSet 102.0, whose tasks have all completed, from pool
19:48:37.505 INFO DAGScheduler - ResultStage 102 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
19:48:37.505 INFO DAGScheduler - Job 72 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.505 INFO TaskSchedulerImpl - Killing all running tasks in stage 102: Stage finished
19:48:37.505 INFO DAGScheduler - Job 72 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010754 s
19:48:37.519 INFO FileInputFormat - Total input files to process : 2
19:48:37.522 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:37.522 INFO DAGScheduler - Got job 73 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:37.522 INFO DAGScheduler - Final stage: ResultStage 103 (count at ReadsSparkSinkUnitTest.java:222)
19:48:37.522 INFO DAGScheduler - Parents of final stage: List()
19:48:37.522 INFO DAGScheduler - Missing parents: List()
19:48:37.523 INFO DAGScheduler - Submitting ResultStage 103 (MapPartitionsRDD[450] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:37.540 INFO MemoryStore - Block broadcast_185 stored as values in memory (estimated size 426.1 KiB, free 1918.9 MiB)
19:48:37.541 INFO MemoryStore - Block broadcast_185_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.7 MiB)
19:48:37.541 INFO BlockManagerInfo - Added broadcast_185_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:37.542 INFO SparkContext - Created broadcast 185 from broadcast at DAGScheduler.scala:1580
19:48:37.542 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 103 (MapPartitionsRDD[450] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.542 INFO TaskSchedulerImpl - Adding task set 103.0 with 2 tasks resource profile 0
19:48:37.542 INFO TaskSetManager - Starting task 0.0 in stage 103.0 (TID 152) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:37.543 INFO TaskSetManager - Starting task 1.0 in stage 103.0 (TID 153) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:37.543 INFO Executor - Running task 0.0 in stage 103.0 (TID 152)
19:48:37.543 INFO Executor - Running task 1.0 in stage 103.0 (TID 153)
19:48:37.588 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest211153268345737747596.bam/part-r-00000.bam:0+129755
19:48:37.588 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest211153268345737747596.bam/part-r-00001.bam:0+129440
19:48:37.601 INFO Executor - Finished task 1.0 in stage 103.0 (TID 153). 989 bytes result sent to driver
19:48:37.601 INFO Executor - Finished task 0.0 in stage 103.0 (TID 152). 989 bytes result sent to driver
19:48:37.602 INFO TaskSetManager - Finished task 0.0 in stage 103.0 (TID 152) in 60 ms on localhost (executor driver) (1/2)
19:48:37.602 INFO TaskSetManager - Finished task 1.0 in stage 103.0 (TID 153) in 59 ms on localhost (executor driver) (2/2)
19:48:37.602 INFO TaskSchedulerImpl - Removed TaskSet 103.0, whose tasks have all completed, from pool
19:48:37.602 INFO DAGScheduler - ResultStage 103 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.079 s
19:48:37.602 INFO DAGScheduler - Job 73 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.602 INFO TaskSchedulerImpl - Killing all running tasks in stage 103: Stage finished
19:48:37.603 INFO DAGScheduler - Job 73 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.080657 s
19:48:37.605 INFO MemoryStore - Block broadcast_186 stored as values in memory (estimated size 298.0 KiB, free 1918.5 MiB)
19:48:37.611 INFO MemoryStore - Block broadcast_186_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
19:48:37.612 INFO BlockManagerInfo - Added broadcast_186_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:37.612 INFO SparkContext - Created broadcast 186 from newAPIHadoopFile at PathSplitSource.java:96
19:48:37.634 INFO MemoryStore - Block broadcast_187 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
19:48:37.640 INFO MemoryStore - Block broadcast_187_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
19:48:37.641 INFO BlockManagerInfo - Added broadcast_187_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:37.641 INFO SparkContext - Created broadcast 187 from newAPIHadoopFile at PathSplitSource.java:96
19:48:37.660 INFO MemoryStore - Block broadcast_188 stored as values in memory (estimated size 19.6 KiB, free 1918.0 MiB)
19:48:37.661 INFO MemoryStore - Block broadcast_188_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1918.0 MiB)
19:48:37.661 INFO BlockManagerInfo - Added broadcast_188_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.6 MiB)
19:48:37.661 INFO SparkContext - Created broadcast 188 from broadcast at ReadsSparkSink.java:133
19:48:37.662 INFO MemoryStore - Block broadcast_189 stored as values in memory (estimated size 20.0 KiB, free 1918.0 MiB)
19:48:37.663 INFO MemoryStore - Block broadcast_189_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1918.0 MiB)
19:48:37.663 INFO BlockManagerInfo - Added broadcast_189_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.6 MiB)
19:48:37.663 INFO SparkContext - Created broadcast 189 from broadcast at AnySamSinkMultiple.java:80
19:48:37.665 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.665 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.665 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.677 INFO FileInputFormat - Total input files to process : 1
19:48:37.683 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:37.684 INFO DAGScheduler - Registering RDD 458 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 22
19:48:37.684 INFO DAGScheduler - Got job 74 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:37.684 INFO DAGScheduler - Final stage: ResultStage 105 (runJob at SparkHadoopWriter.scala:83)
19:48:37.684 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 104)
19:48:37.684 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 104)
19:48:37.684 INFO DAGScheduler - Submitting ShuffleMapStage 104 (MapPartitionsRDD[458] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:37.713 INFO MemoryStore - Block broadcast_190 stored as values in memory (estimated size 427.7 KiB, free 1917.6 MiB)
19:48:37.714 INFO MemoryStore - Block broadcast_190_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.5 MiB)
19:48:37.715 INFO BlockManagerInfo - Added broadcast_190_piece0 in memory on localhost:36125 (size: 154.6 KiB, free: 1919.5 MiB)
19:48:37.715 INFO SparkContext - Created broadcast 190 from broadcast at DAGScheduler.scala:1580
19:48:37.715 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 104 (MapPartitionsRDD[458] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:37.715 INFO TaskSchedulerImpl - Adding task set 104.0 with 1 tasks resource profile 0
19:48:37.716 INFO TaskSetManager - Starting task 0.0 in stage 104.0 (TID 154) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
19:48:37.716 INFO Executor - Running task 0.0 in stage 104.0 (TID 154)
19:48:37.745 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:37.759 INFO Executor - Finished task 0.0 in stage 104.0 (TID 154). 1149 bytes result sent to driver
19:48:37.759 INFO TaskSetManager - Finished task 0.0 in stage 104.0 (TID 154) in 44 ms on localhost (executor driver) (1/1)
19:48:37.759 INFO TaskSchedulerImpl - Removed TaskSet 104.0, whose tasks have all completed, from pool
19:48:37.760 INFO DAGScheduler - ShuffleMapStage 104 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.076 s
19:48:37.760 INFO DAGScheduler - looking for newly runnable stages
19:48:37.760 INFO DAGScheduler - running: HashSet()
19:48:37.760 INFO DAGScheduler - waiting: HashSet(ResultStage 105)
19:48:37.760 INFO DAGScheduler - failed: HashSet()
19:48:37.760 INFO DAGScheduler - Submitting ResultStage 105 (MapPartitionsRDD[470] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:37.770 INFO MemoryStore - Block broadcast_191 stored as values in memory (estimated size 150.2 KiB, free 1917.3 MiB)
19:48:37.771 INFO MemoryStore - Block broadcast_191_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1917.3 MiB)
19:48:37.771 INFO BlockManagerInfo - Added broadcast_191_piece0 in memory on localhost:36125 (size: 56.2 KiB, free: 1919.4 MiB)
19:48:37.772 INFO SparkContext - Created broadcast 191 from broadcast at DAGScheduler.scala:1580
19:48:37.772 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 105 (MapPartitionsRDD[470] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.772 INFO TaskSchedulerImpl - Adding task set 105.0 with 2 tasks resource profile 0
19:48:37.773 INFO TaskSetManager - Starting task 0.0 in stage 105.0 (TID 155) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:37.773 INFO TaskSetManager - Starting task 1.0 in stage 105.0 (TID 156) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:37.773 INFO Executor - Running task 1.0 in stage 105.0 (TID 156)
19:48:37.773 INFO Executor - Running task 0.0 in stage 105.0 (TID 155)
19:48:37.778 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.778 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.778 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.778 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.778 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.778 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.779 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.779 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.779 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.779 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:37.779 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:37.779 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:37.789 INFO ShuffleBlockFetcherIterator - Getting 1 (160.4 KiB) non-empty blocks including 1 (160.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.789 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.793 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.793 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.795 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948372901091461469260798_0470_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest315521585592022892860.bam/_temporary/0/task_202507151948372901091461469260798_0470_r_000000
19:48:37.796 INFO SparkHadoopMapRedUtil - attempt_202507151948372901091461469260798_0470_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:37.796 INFO Executor - Finished task 0.0 in stage 105.0 (TID 155). 1729 bytes result sent to driver
19:48:37.796 INFO TaskSetManager - Finished task 0.0 in stage 105.0 (TID 155) in 24 ms on localhost (executor driver) (1/2)
19:48:37.800 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948372901091461469260798_0470_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest315521585592022892860.bam/_temporary/0/task_202507151948372901091461469260798_0470_r_000001
19:48:37.800 INFO SparkHadoopMapRedUtil - attempt_202507151948372901091461469260798_0470_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:37.801 INFO Executor - Finished task 1.0 in stage 105.0 (TID 156). 1729 bytes result sent to driver
19:48:37.801 INFO TaskSetManager - Finished task 1.0 in stage 105.0 (TID 156) in 28 ms on localhost (executor driver) (2/2)
19:48:37.801 INFO TaskSchedulerImpl - Removed TaskSet 105.0, whose tasks have all completed, from pool
19:48:37.801 INFO DAGScheduler - ResultStage 105 (runJob at SparkHadoopWriter.scala:83) finished in 0.041 s
19:48:37.801 INFO DAGScheduler - Job 74 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.801 INFO TaskSchedulerImpl - Killing all running tasks in stage 105: Stage finished
19:48:37.801 INFO DAGScheduler - Job 74 finished: runJob at SparkHadoopWriter.scala:83, took 0.117935 s
19:48:37.802 INFO SparkHadoopWriter - Start to commit write Job job_202507151948372901091461469260798_0470.
19:48:37.807 INFO SparkHadoopWriter - Write Job job_202507151948372901091461469260798_0470 committed. Elapsed time: 5 ms.
19:48:37.809 INFO MemoryStore - Block broadcast_192 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
19:48:37.818 INFO MemoryStore - Block broadcast_192_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
19:48:37.818 INFO BlockManagerInfo - Added broadcast_192_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:37.818 INFO SparkContext - Created broadcast 192 from newAPIHadoopFile at PathSplitSource.java:96
19:48:37.841 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:37.841 INFO DAGScheduler - Got job 75 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:37.841 INFO DAGScheduler - Final stage: ResultStage 107 (count at ReadsSparkSinkUnitTest.java:222)
19:48:37.841 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 106)
19:48:37.841 INFO DAGScheduler - Missing parents: List()
19:48:37.842 INFO DAGScheduler - Submitting ResultStage 107 (MapPartitionsRDD[461] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:37.842 INFO MemoryStore - Block broadcast_193 stored as values in memory (estimated size 6.3 KiB, free 1916.9 MiB)
19:48:37.843 INFO MemoryStore - Block broadcast_193_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.9 MiB)
19:48:37.843 INFO BlockManagerInfo - Added broadcast_193_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.4 MiB)
19:48:37.843 INFO SparkContext - Created broadcast 193 from broadcast at DAGScheduler.scala:1580
19:48:37.843 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 107 (MapPartitionsRDD[461] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.843 INFO TaskSchedulerImpl - Adding task set 107.0 with 2 tasks resource profile 0
19:48:37.844 INFO TaskSetManager - Starting task 0.0 in stage 107.0 (TID 157) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:37.844 INFO TaskSetManager - Starting task 1.0 in stage 107.0 (TID 158) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:37.845 INFO Executor - Running task 1.0 in stage 107.0 (TID 158)
19:48:37.845 INFO Executor - Running task 0.0 in stage 107.0 (TID 157)
19:48:37.846 INFO ShuffleBlockFetcherIterator - Getting 1 (160.4 KiB) non-empty blocks including 1 (160.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.846 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:37.846 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.846 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:37.851 INFO Executor - Finished task 1.0 in stage 107.0 (TID 158). 1634 bytes result sent to driver
19:48:37.851 INFO Executor - Finished task 0.0 in stage 107.0 (TID 157). 1591 bytes result sent to driver
19:48:37.851 INFO TaskSetManager - Finished task 1.0 in stage 107.0 (TID 158) in 7 ms on localhost (executor driver) (1/2)
19:48:37.851 INFO TaskSetManager - Finished task 0.0 in stage 107.0 (TID 157) in 7 ms on localhost (executor driver) (2/2)
19:48:37.851 INFO TaskSchedulerImpl - Removed TaskSet 107.0, whose tasks have all completed, from pool
19:48:37.851 INFO DAGScheduler - ResultStage 107 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.009 s
19:48:37.851 INFO DAGScheduler - Job 75 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.851 INFO TaskSchedulerImpl - Killing all running tasks in stage 107: Stage finished
19:48:37.852 INFO DAGScheduler - Job 75 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010474 s
19:48:37.865 INFO FileInputFormat - Total input files to process : 2
19:48:37.868 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:37.868 INFO DAGScheduler - Got job 76 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:37.868 INFO DAGScheduler - Final stage: ResultStage 108 (count at ReadsSparkSinkUnitTest.java:222)
19:48:37.868 INFO DAGScheduler - Parents of final stage: List()
19:48:37.868 INFO DAGScheduler - Missing parents: List()
19:48:37.869 INFO DAGScheduler - Submitting ResultStage 108 (MapPartitionsRDD[477] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:37.886 INFO MemoryStore - Block broadcast_194 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
19:48:37.887 INFO MemoryStore - Block broadcast_194_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
19:48:37.888 INFO BlockManagerInfo - Added broadcast_194_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.2 MiB)
19:48:37.888 INFO SparkContext - Created broadcast 194 from broadcast at DAGScheduler.scala:1580
19:48:37.888 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 108 (MapPartitionsRDD[477] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:37.888 INFO TaskSchedulerImpl - Adding task set 108.0 with 2 tasks resource profile 0
19:48:37.889 INFO TaskSetManager - Starting task 0.0 in stage 108.0 (TID 159) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:37.889 INFO TaskSetManager - Starting task 1.0 in stage 108.0 (TID 160) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:37.889 INFO Executor - Running task 0.0 in stage 108.0 (TID 159)
19:48:37.889 INFO Executor - Running task 1.0 in stage 108.0 (TID 160)
19:48:37.919 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest315521585592022892860.bam/part-r-00001.bam:0+123314
19:48:37.922 INFO Executor - Finished task 0.0 in stage 108.0 (TID 159). 989 bytes result sent to driver
19:48:37.923 INFO TaskSetManager - Finished task 0.0 in stage 108.0 (TID 159) in 35 ms on localhost (executor driver) (1/2)
19:48:37.933 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest315521585592022892860.bam/part-r-00000.bam:0+122169
19:48:37.945 INFO Executor - Finished task 1.0 in stage 108.0 (TID 160). 1075 bytes result sent to driver
19:48:37.946 INFO BlockManagerInfo - Removed broadcast_177_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.3 MiB)
19:48:37.946 INFO TaskSetManager - Finished task 1.0 in stage 108.0 (TID 160) in 57 ms on localhost (executor driver) (2/2)
19:48:37.946 INFO TaskSchedulerImpl - Removed TaskSet 108.0, whose tasks have all completed, from pool
19:48:37.946 INFO DAGScheduler - ResultStage 108 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.077 s
19:48:37.946 INFO DAGScheduler - Job 76 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:37.946 INFO TaskSchedulerImpl - Killing all running tasks in stage 108: Stage finished
19:48:37.946 INFO DAGScheduler - Job 76 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.077990 s
19:48:37.947 INFO BlockManagerInfo - Removed broadcast_183_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:37.948 INFO BlockManagerInfo - Removed broadcast_188_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.3 MiB)
19:48:37.948 INFO BlockManagerInfo - Removed broadcast_184_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.3 MiB)
19:48:37.949 INFO MemoryStore - Block broadcast_195 stored as values in memory (estimated size 576.0 B, free 1917.0 MiB)
19:48:37.950 INFO BlockManagerInfo - Removed broadcast_187_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:37.950 INFO MemoryStore - Block broadcast_195_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.4 MiB)
19:48:37.950 INFO BlockManagerInfo - Added broadcast_195_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.4 MiB)
19:48:37.950 INFO SparkContext - Created broadcast 195 from broadcast at CramSource.java:114
19:48:37.951 INFO MemoryStore - Block broadcast_196 stored as values in memory (estimated size 297.9 KiB, free 1917.1 MiB)
19:48:37.952 INFO BlockManagerInfo - Removed broadcast_185_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:37.952 INFO BlockManagerInfo - Removed broadcast_190_piece0 on localhost:36125 in memory (size: 154.6 KiB, free: 1919.7 MiB)
19:48:37.954 INFO BlockManagerInfo - Removed broadcast_193_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.7 MiB)
19:48:37.955 INFO BlockManagerInfo - Removed broadcast_189_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.7 MiB)
19:48:37.955 INFO BlockManagerInfo - Removed broadcast_191_piece0 on localhost:36125 in memory (size: 56.2 KiB, free: 1919.8 MiB)
19:48:37.962 INFO MemoryStore - Block broadcast_196_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
19:48:37.962 INFO BlockManagerInfo - Added broadcast_196_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:37.962 INFO SparkContext - Created broadcast 196 from newAPIHadoopFile at PathSplitSource.java:96
19:48:37.981 INFO MemoryStore - Block broadcast_197 stored as values in memory (estimated size 576.0 B, free 1918.4 MiB)
19:48:37.981 INFO MemoryStore - Block broadcast_197_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.4 MiB)
19:48:37.981 INFO BlockManagerInfo - Added broadcast_197_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.7 MiB)
19:48:37.981 INFO SparkContext - Created broadcast 197 from broadcast at CramSource.java:114
19:48:37.983 INFO MemoryStore - Block broadcast_198 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:37.993 INFO MemoryStore - Block broadcast_198_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
19:48:37.993 INFO BlockManagerInfo - Added broadcast_198_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:37.993 INFO SparkContext - Created broadcast 198 from newAPIHadoopFile at PathSplitSource.java:96
19:48:38.007 INFO MemoryStore - Block broadcast_199 stored as values in memory (estimated size 6.0 KiB, free 1918.1 MiB)
19:48:38.007 INFO MemoryStore - Block broadcast_199_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1918.1 MiB)
19:48:38.007 INFO BlockManagerInfo - Added broadcast_199_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.7 MiB)
19:48:38.008 INFO SparkContext - Created broadcast 199 from broadcast at ReadsSparkSink.java:133
19:48:38.008 INFO MemoryStore - Block broadcast_200 stored as values in memory (estimated size 6.2 KiB, free 1918.1 MiB)
19:48:38.009 INFO MemoryStore - Block broadcast_200_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1918.1 MiB)
19:48:38.009 INFO BlockManagerInfo - Added broadcast_200_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.7 MiB)
19:48:38.009 INFO SparkContext - Created broadcast 200 from broadcast at AnySamSinkMultiple.java:80
19:48:38.011 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.011 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.011 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.023 INFO FileInputFormat - Total input files to process : 1
19:48:38.029 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:38.030 INFO DAGScheduler - Registering RDD 484 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 23
19:48:38.030 INFO DAGScheduler - Got job 77 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:38.030 INFO DAGScheduler - Final stage: ResultStage 110 (runJob at SparkHadoopWriter.scala:83)
19:48:38.030 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 109)
19:48:38.030 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 109)
19:48:38.030 INFO DAGScheduler - Submitting ShuffleMapStage 109 (MapPartitionsRDD[484] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:38.045 INFO MemoryStore - Block broadcast_201 stored as values in memory (estimated size 288.4 KiB, free 1917.8 MiB)
19:48:38.047 INFO MemoryStore - Block broadcast_201_piece0 stored as bytes in memory (estimated size 104.7 KiB, free 1917.7 MiB)
19:48:38.047 INFO BlockManagerInfo - Added broadcast_201_piece0 in memory on localhost:36125 (size: 104.7 KiB, free: 1919.5 MiB)
19:48:38.047 INFO SparkContext - Created broadcast 201 from broadcast at DAGScheduler.scala:1580
19:48:38.047 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 109 (MapPartitionsRDD[484] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:38.047 INFO TaskSchedulerImpl - Adding task set 109.0 with 1 tasks resource profile 0
19:48:38.048 INFO TaskSetManager - Starting task 0.0 in stage 109.0 (TID 161) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
19:48:38.048 INFO Executor - Running task 0.0 in stage 109.0 (TID 161)
19:48:38.070 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:38.084 INFO Executor - Finished task 0.0 in stage 109.0 (TID 161). 1149 bytes result sent to driver
19:48:38.084 INFO TaskSetManager - Finished task 0.0 in stage 109.0 (TID 161) in 36 ms on localhost (executor driver) (1/1)
19:48:38.084 INFO TaskSchedulerImpl - Removed TaskSet 109.0, whose tasks have all completed, from pool
19:48:38.084 INFO DAGScheduler - ShuffleMapStage 109 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.053 s
19:48:38.084 INFO DAGScheduler - looking for newly runnable stages
19:48:38.084 INFO DAGScheduler - running: HashSet()
19:48:38.084 INFO DAGScheduler - waiting: HashSet(ResultStage 110)
19:48:38.084 INFO DAGScheduler - failed: HashSet()
19:48:38.085 INFO DAGScheduler - Submitting ResultStage 110 (MapPartitionsRDD[495] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:38.091 INFO MemoryStore - Block broadcast_202 stored as values in memory (estimated size 150.3 KiB, free 1917.5 MiB)
19:48:38.092 INFO MemoryStore - Block broadcast_202_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1917.5 MiB)
19:48:38.092 INFO BlockManagerInfo - Added broadcast_202_piece0 in memory on localhost:36125 (size: 56.3 KiB, free: 1919.5 MiB)
19:48:38.092 INFO SparkContext - Created broadcast 202 from broadcast at DAGScheduler.scala:1580
19:48:38.092 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 110 (MapPartitionsRDD[495] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:38.092 INFO TaskSchedulerImpl - Adding task set 110.0 with 2 tasks resource profile 0
19:48:38.093 INFO TaskSetManager - Starting task 0.0 in stage 110.0 (TID 162) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:38.093 INFO TaskSetManager - Starting task 1.0 in stage 110.0 (TID 163) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:38.093 INFO Executor - Running task 1.0 in stage 110.0 (TID 163)
19:48:38.093 INFO Executor - Running task 0.0 in stage 110.0 (TID 162)
19:48:38.098 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.098 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.098 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.098 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.098 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.098 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.100 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.100 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.100 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.100 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.100 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.100 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.110 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.110 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.112 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.112 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.115 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948384956076694523021487_0495_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest52382753180202664940.cram/_temporary/0/task_202507151948384956076694523021487_0495_r_000001
19:48:38.115 INFO SparkHadoopMapRedUtil - attempt_202507151948384956076694523021487_0495_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:38.116 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948384956076694523021487_0495_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest52382753180202664940.cram/_temporary/0/task_202507151948384956076694523021487_0495_r_000000
19:48:38.116 INFO SparkHadoopMapRedUtil - attempt_202507151948384956076694523021487_0495_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:38.116 INFO Executor - Finished task 1.0 in stage 110.0 (TID 163). 1729 bytes result sent to driver
19:48:38.116 INFO Executor - Finished task 0.0 in stage 110.0 (TID 162). 1729 bytes result sent to driver
19:48:38.116 INFO TaskSetManager - Finished task 1.0 in stage 110.0 (TID 163) in 23 ms on localhost (executor driver) (1/2)
19:48:38.116 INFO TaskSetManager - Finished task 0.0 in stage 110.0 (TID 162) in 23 ms on localhost (executor driver) (2/2)
19:48:38.116 INFO TaskSchedulerImpl - Removed TaskSet 110.0, whose tasks have all completed, from pool
19:48:38.117 INFO DAGScheduler - ResultStage 110 (runJob at SparkHadoopWriter.scala:83) finished in 0.032 s
19:48:38.117 INFO DAGScheduler - Job 77 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.117 INFO TaskSchedulerImpl - Killing all running tasks in stage 110: Stage finished
19:48:38.117 INFO DAGScheduler - Job 77 finished: runJob at SparkHadoopWriter.scala:83, took 0.087359 s
19:48:38.117 INFO SparkHadoopWriter - Start to commit write Job job_202507151948384956076694523021487_0495.
19:48:38.122 INFO SparkHadoopWriter - Write Job job_202507151948384956076694523021487_0495 committed. Elapsed time: 4 ms.
19:48:38.124 INFO MemoryStore - Block broadcast_203 stored as values in memory (estimated size 297.9 KiB, free 1917.2 MiB)
19:48:38.130 INFO MemoryStore - Block broadcast_203_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.1 MiB)
19:48:38.130 INFO BlockManagerInfo - Added broadcast_203_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:38.130 INFO SparkContext - Created broadcast 203 from newAPIHadoopFile at PathSplitSource.java:96
19:48:38.153 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:38.153 INFO DAGScheduler - Got job 78 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:38.153 INFO DAGScheduler - Final stage: ResultStage 112 (count at ReadsSparkSinkUnitTest.java:222)
19:48:38.153 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 111)
19:48:38.153 INFO DAGScheduler - Missing parents: List()
19:48:38.153 INFO DAGScheduler - Submitting ResultStage 112 (MapPartitionsRDD[487] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:38.154 INFO MemoryStore - Block broadcast_204 stored as values in memory (estimated size 6.3 KiB, free 1917.1 MiB)
19:48:38.155 INFO MemoryStore - Block broadcast_204_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1917.1 MiB)
19:48:38.155 INFO BlockManagerInfo - Added broadcast_204_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.4 MiB)
19:48:38.155 INFO SparkContext - Created broadcast 204 from broadcast at DAGScheduler.scala:1580
19:48:38.155 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 112 (MapPartitionsRDD[487] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:38.155 INFO TaskSchedulerImpl - Adding task set 112.0 with 2 tasks resource profile 0
19:48:38.156 INFO TaskSetManager - Starting task 0.0 in stage 112.0 (TID 164) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:38.156 INFO TaskSetManager - Starting task 1.0 in stage 112.0 (TID 165) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:38.156 INFO Executor - Running task 1.0 in stage 112.0 (TID 165)
19:48:38.156 INFO Executor - Running task 0.0 in stage 112.0 (TID 164)
19:48:38.158 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.158 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.158 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.158 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.161 INFO Executor - Finished task 1.0 in stage 112.0 (TID 165). 1591 bytes result sent to driver
19:48:38.162 INFO Executor - Finished task 0.0 in stage 112.0 (TID 164). 1591 bytes result sent to driver
19:48:38.162 INFO TaskSetManager - Finished task 1.0 in stage 112.0 (TID 165) in 6 ms on localhost (executor driver) (1/2)
19:48:38.162 INFO TaskSetManager - Finished task 0.0 in stage 112.0 (TID 164) in 6 ms on localhost (executor driver) (2/2)
19:48:38.162 INFO TaskSchedulerImpl - Removed TaskSet 112.0, whose tasks have all completed, from pool
19:48:38.162 INFO DAGScheduler - ResultStage 112 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.008 s
19:48:38.162 INFO DAGScheduler - Job 78 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.162 INFO TaskSchedulerImpl - Killing all running tasks in stage 112: Stage finished
19:48:38.162 INFO DAGScheduler - Job 78 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.009675 s
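
Job 78 is a plain count() over the repartitioned reads. Note that ShuffleMapStage 111 is listed as a parent but not as a missing parent, so the shuffle output computed for the preceding write is reused and only ResultStage 112 actually runs. A hedged sketch of the count-and-compare pattern the test presumably applies (names are illustrative):

    import org.apache.spark.api.java.JavaRDD;

    // Hedged sketch with illustrative names: each count() triggers one job, and the
    // already-computed shuffle output lets the job skip its ShuffleMapStage parent.
    final class CountCheck {
        static <T> void assertSameSize(JavaRDD<T> written, JavaRDD<T> readBack) {
            long expected = written.count();
            long actual = readBack.count();
            if (actual != expected) {
                throw new AssertionError("expected " + expected + " records, got " + actual);
            }
        }
    }
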
19:48:38.175 INFO FileInputFormat - Total input files to process : 2
19:48:38.178 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:38.179 INFO DAGScheduler - Got job 79 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:38.179 INFO DAGScheduler - Final stage: ResultStage 113 (count at ReadsSparkSinkUnitTest.java:222)
19:48:38.179 INFO DAGScheduler - Parents of final stage: List()
19:48:38.179 INFO DAGScheduler - Missing parents: List()
19:48:38.179 INFO DAGScheduler - Submitting ResultStage 113 (MapPartitionsRDD[502] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:38.196 INFO MemoryStore - Block broadcast_205 stored as values in memory (estimated size 426.1 KiB, free 1916.7 MiB)
19:48:38.198 INFO MemoryStore - Block broadcast_205_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.6 MiB)
19:48:38.198 INFO BlockManagerInfo - Added broadcast_205_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.3 MiB)
19:48:38.198 INFO SparkContext - Created broadcast 205 from broadcast at DAGScheduler.scala:1580
19:48:38.198 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 113 (MapPartitionsRDD[502] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:38.198 INFO TaskSchedulerImpl - Adding task set 113.0 with 2 tasks resource profile 0
19:48:38.199 INFO TaskSetManager - Starting task 0.0 in stage 113.0 (TID 166) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:38.199 INFO TaskSetManager - Starting task 1.0 in stage 113.0 (TID 167) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:38.199 INFO Executor - Running task 1.0 in stage 113.0 (TID 167)
19:48:38.199 INFO Executor - Running task 0.0 in stage 113.0 (TID 166)
19:48:38.229 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest52382753180202664940.cram/part-r-00001.bam:0+30825
19:48:38.230 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest52382753180202664940.cram/part-r-00000.bam:0+31473
19:48:38.232 INFO Executor - Finished task 0.0 in stage 113.0 (TID 166). 989 bytes result sent to driver
19:48:38.232 INFO Executor - Finished task 1.0 in stage 113.0 (TID 167). 989 bytes result sent to driver
19:48:38.233 INFO TaskSetManager - Finished task 0.0 in stage 113.0 (TID 166) in 34 ms on localhost (executor driver) (1/2)
19:48:38.233 INFO TaskSetManager - Finished task 1.0 in stage 113.0 (TID 167) in 34 ms on localhost (executor driver) (2/2)
19:48:38.233 INFO TaskSchedulerImpl - Removed TaskSet 113.0, whose tasks have all completed, from pool
19:48:38.233 INFO DAGScheduler - ResultStage 113 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.054 s
19:48:38.233 INFO DAGScheduler - Job 79 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.233 INFO TaskSchedulerImpl - Killing all running tasks in stage 113: Stage finished
19:48:38.233 INFO DAGScheduler - Job 79 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.054994 s
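
Job 79 reads the freshly written output directory back: FileInputFormat lists the two part-r-* files and each becomes its own NewHadoopRDD split. A generic sketch of that read path follows; TextInputFormat stands in for the BAM-aware input format the real ReadsSparkSource uses, so treat this as an illustration of newAPIHadoopFile only.

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    // Generic illustration only: pointing newAPIHadoopFile at the output directory
    // picks up each part-r-* file as its own split, matching the
    // "Total input files to process : 2" line above.
    final class ReadBackSketch {
        static long countPartRecords(JavaSparkContext ctx, String outputDir) {
            JavaPairRDD<LongWritable, Text> parts = ctx.newAPIHadoopFile(
                    outputDir, TextInputFormat.class, LongWritable.class, Text.class,
                    ctx.hadoopConfiguration());
            return parts.count();
        }
    }
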
19:48:38.237 INFO MemoryStore - Block broadcast_206 stored as values in memory (estimated size 297.9 KiB, free 1916.3 MiB)
19:48:38.243 INFO MemoryStore - Block broadcast_206_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.2 MiB)
19:48:38.243 INFO BlockManagerInfo - Added broadcast_206_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.2 MiB)
19:48:38.243 INFO SparkContext - Created broadcast 206 from newAPIHadoopFile at PathSplitSource.java:96
19:48:38.266 INFO MemoryStore - Block broadcast_207 stored as values in memory (estimated size 297.9 KiB, free 1915.9 MiB)
19:48:38.274 INFO BlockManagerInfo - Removed broadcast_202_piece0 on localhost:36125 in memory (size: 56.3 KiB, free: 1919.3 MiB)
19:48:38.275 INFO BlockManagerInfo - Removed broadcast_194_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.4 MiB)
19:48:38.275 INFO BlockManagerInfo - Removed broadcast_196_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:38.276 INFO BlockManagerInfo - Removed broadcast_192_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:38.276 INFO BlockManagerInfo - Removed broadcast_195_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.5 MiB)
19:48:38.277 INFO BlockManagerInfo - Removed broadcast_203_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:38.277 INFO BlockManagerInfo - Removed broadcast_186_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:38.278 INFO BlockManagerInfo - Removed broadcast_204_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.6 MiB)
19:48:38.279 INFO BlockManagerInfo - Removed broadcast_198_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:38.279 INFO BlockManagerInfo - Removed broadcast_199_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.7 MiB)
19:48:38.280 INFO BlockManagerInfo - Removed broadcast_200_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.7 MiB)
19:48:38.280 INFO BlockManagerInfo - Removed broadcast_201_piece0 on localhost:36125 in memory (size: 104.7 KiB, free: 1919.8 MiB)
19:48:38.280 INFO BlockManagerInfo - Removed broadcast_205_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1920.0 MiB)
19:48:38.281 INFO BlockManagerInfo - Removed broadcast_197_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1920.0 MiB)
19:48:38.281 INFO MemoryStore - Block broadcast_207_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:38.281 INFO BlockManagerInfo - Added broadcast_207_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:38.282 INFO SparkContext - Created broadcast 207 from newAPIHadoopFile at PathSplitSource.java:96
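
The MemoryStore/BlockManagerInfo lines around broadcast 207 show the usual broadcast lifecycle: the value is cached deserialized on the driver, a serialized piece0 is registered for executors, and the ContextCleaner asynchronously drops pieces of broadcasts that are no longer referenced (the burst of "Removed broadcast_*_piece0" lines). A small sketch of that lifecycle, with an explicit unpersist() standing in for the asynchronous cleanup:

    import java.util.Arrays;
    import java.util.List;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.broadcast.Broadcast;

    // Sketch of the broadcast lifecycle: value cached on the driver, serialized
    // piece shipped to executors on first use, explicit release at the end.
    final class BroadcastSketch {
        static long useBroadcast(JavaSparkContext ctx) {
            List<String> contigs = Arrays.asList("chr1", "chr2");
            Broadcast<List<String>> table = ctx.broadcast(contigs);
            long hits = ctx.parallelize(Arrays.asList(0, 1, 1, 0))
                    .filter(i -> table.value().get(i).startsWith("chr"))
                    .count();
            table.unpersist();   // destroy() would also free the driver-side copy
            return hits;
        }
    }
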
19:48:38.310 INFO MemoryStore - Block broadcast_208 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
19:48:38.311 INFO MemoryStore - Block broadcast_208_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
19:48:38.311 INFO BlockManagerInfo - Added broadcast_208_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.9 MiB)
19:48:38.311 INFO SparkContext - Created broadcast 208 from broadcast at ReadsSparkSink.java:133
19:48:38.313 INFO MemoryStore - Block broadcast_209 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
19:48:38.313 INFO MemoryStore - Block broadcast_209_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
19:48:38.313 INFO BlockManagerInfo - Added broadcast_209_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.9 MiB)
19:48:38.314 INFO SparkContext - Created broadcast 209 from broadcast at AnySamSinkMultiple.java:80
19:48:38.315 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.315 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.315 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.327 INFO FileInputFormat - Total input files to process : 1
19:48:38.333 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:38.334 INFO DAGScheduler - Registering RDD 510 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 24
19:48:38.334 INFO DAGScheduler - Got job 80 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
19:48:38.334 INFO DAGScheduler - Final stage: ResultStage 115 (runJob at SparkHadoopWriter.scala:83)
19:48:38.334 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 114)
19:48:38.334 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 114)
19:48:38.334 INFO DAGScheduler - Submitting ShuffleMapStage 114 (MapPartitionsRDD[510] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:38.358 INFO MemoryStore - Block broadcast_210 stored as values in memory (estimated size 427.7 KiB, free 1918.6 MiB)
19:48:38.360 INFO MemoryStore - Block broadcast_210_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.4 MiB)
19:48:38.360 INFO BlockManagerInfo - Added broadcast_210_piece0 in memory on localhost:36125 (size: 154.6 KiB, free: 1919.7 MiB)
19:48:38.360 INFO SparkContext - Created broadcast 210 from broadcast at DAGScheduler.scala:1580
19:48:38.360 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 114 (MapPartitionsRDD[510] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
19:48:38.360 INFO TaskSchedulerImpl - Adding task set 114.0 with 1 tasks resource profile 0
19:48:38.361 INFO TaskSetManager - Starting task 0.0 in stage 114.0 (TID 168) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:38.361 INFO Executor - Running task 0.0 in stage 114.0 (TID 168)
19:48:38.391 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:38.407 INFO Executor - Finished task 0.0 in stage 114.0 (TID 168). 1149 bytes result sent to driver
19:48:38.408 INFO TaskSetManager - Finished task 0.0 in stage 114.0 (TID 168) in 47 ms on localhost (executor driver) (1/1)
19:48:38.408 INFO TaskSchedulerImpl - Removed TaskSet 114.0, whose tasks have all completed, from pool
19:48:38.408 INFO DAGScheduler - ShuffleMapStage 114 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.074 s
19:48:38.408 INFO DAGScheduler - looking for newly runnable stages
19:48:38.408 INFO DAGScheduler - running: HashSet()
19:48:38.408 INFO DAGScheduler - waiting: HashSet(ResultStage 115)
19:48:38.408 INFO DAGScheduler - failed: HashSet()
19:48:38.408 INFO DAGScheduler - Submitting ResultStage 115 (MapPartitionsRDD[522] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
19:48:38.415 INFO MemoryStore - Block broadcast_211 stored as values in memory (estimated size 150.2 KiB, free 1918.3 MiB)
19:48:38.415 INFO MemoryStore - Block broadcast_211_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1918.2 MiB)
19:48:38.416 INFO BlockManagerInfo - Added broadcast_211_piece0 in memory on localhost:36125 (size: 56.2 KiB, free: 1919.7 MiB)
19:48:38.416 INFO SparkContext - Created broadcast 211 from broadcast at DAGScheduler.scala:1580
19:48:38.416 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 115 (MapPartitionsRDD[522] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
19:48:38.416 INFO TaskSchedulerImpl - Adding task set 115.0 with 2 tasks resource profile 0
19:48:38.416 INFO TaskSetManager - Starting task 0.0 in stage 115.0 (TID 169) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:38.417 INFO TaskSetManager - Starting task 1.0 in stage 115.0 (TID 170) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:38.417 INFO Executor - Running task 0.0 in stage 115.0 (TID 169)
19:48:38.417 INFO Executor - Running task 1.0 in stage 115.0 (TID 170)
19:48:38.422 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.422 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.422 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.422 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.422 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.422 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.423 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.423 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.423 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.423 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.423 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.423 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.432 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.432 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.437 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.437 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.439 INFO FileOutputCommitter - Saved output of task 'attempt_2025071519483859401010510928459_0522_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest614129085813003995843.sam/_temporary/0/task_2025071519483859401010510928459_0522_r_000001
19:48:38.439 INFO SparkHadoopMapRedUtil - attempt_2025071519483859401010510928459_0522_r_000001_0: Committed. Elapsed time: 0 ms.
19:48:38.440 INFO Executor - Finished task 1.0 in stage 115.0 (TID 170). 1729 bytes result sent to driver
19:48:38.440 INFO TaskSetManager - Finished task 1.0 in stage 115.0 (TID 170) in 24 ms on localhost (executor driver) (1/2)
19:48:38.443 INFO FileOutputCommitter - Saved output of task 'attempt_2025071519483859401010510928459_0522_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest614129085813003995843.sam/_temporary/0/task_2025071519483859401010510928459_0522_r_000000
19:48:38.443 INFO SparkHadoopMapRedUtil - attempt_2025071519483859401010510928459_0522_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:38.444 INFO Executor - Finished task 0.0 in stage 115.0 (TID 169). 1729 bytes result sent to driver
19:48:38.444 INFO TaskSetManager - Finished task 0.0 in stage 115.0 (TID 169) in 28 ms on localhost (executor driver) (2/2)
19:48:38.444 INFO TaskSchedulerImpl - Removed TaskSet 115.0, whose tasks have all completed, from pool
19:48:38.444 INFO DAGScheduler - ResultStage 115 (runJob at SparkHadoopWriter.scala:83) finished in 0.035 s
19:48:38.444 INFO DAGScheduler - Job 80 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.444 INFO TaskSchedulerImpl - Killing all running tasks in stage 115: Stage finished
19:48:38.444 INFO DAGScheduler - Job 80 finished: runJob at SparkHadoopWriter.scala:83, took 0.111014 s
19:48:38.445 INFO SparkHadoopWriter - Start to commit write Job job_2025071519483859401010510928459_0522.
19:48:38.451 INFO SparkHadoopWriter - Write Job job_2025071519483859401010510928459_0522 committed. Elapsed time: 5 ms.
19:48:38.453 INFO MemoryStore - Block broadcast_212 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
19:48:38.459 INFO MemoryStore - Block broadcast_212_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
19:48:38.459 INFO BlockManagerInfo - Added broadcast_212_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:38.459 INFO SparkContext - Created broadcast 212 from newAPIHadoopFile at PathSplitSource.java:96
19:48:38.483 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:38.484 INFO DAGScheduler - Got job 81 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:38.484 INFO DAGScheduler - Final stage: ResultStage 117 (count at ReadsSparkSinkUnitTest.java:222)
19:48:38.484 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 116)
19:48:38.484 INFO DAGScheduler - Missing parents: List()
19:48:38.484 INFO DAGScheduler - Submitting ResultStage 117 (MapPartitionsRDD[513] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
19:48:38.485 INFO MemoryStore - Block broadcast_213 stored as values in memory (estimated size 6.3 KiB, free 1917.9 MiB)
19:48:38.485 INFO MemoryStore - Block broadcast_213_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1917.9 MiB)
19:48:38.485 INFO BlockManagerInfo - Added broadcast_213_piece0 in memory on localhost:36125 (size: 3.4 KiB, free: 1919.6 MiB)
19:48:38.485 INFO SparkContext - Created broadcast 213 from broadcast at DAGScheduler.scala:1580
19:48:38.485 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 117 (MapPartitionsRDD[513] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
19:48:38.485 INFO TaskSchedulerImpl - Adding task set 117.0 with 2 tasks resource profile 0
19:48:38.486 INFO TaskSetManager - Starting task 0.0 in stage 117.0 (TID 171) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
19:48:38.486 INFO TaskSetManager - Starting task 1.0 in stage 117.0 (TID 172) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
19:48:38.487 INFO Executor - Running task 1.0 in stage 117.0 (TID 172)
19:48:38.487 INFO Executor - Running task 0.0 in stage 117.0 (TID 171)
19:48:38.488 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.488 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.488 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.488 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.492 INFO Executor - Finished task 0.0 in stage 117.0 (TID 171). 1591 bytes result sent to driver
19:48:38.492 INFO Executor - Finished task 1.0 in stage 117.0 (TID 172). 1591 bytes result sent to driver
19:48:38.493 INFO TaskSetManager - Finished task 1.0 in stage 117.0 (TID 172) in 7 ms on localhost (executor driver) (1/2)
19:48:38.493 INFO TaskSetManager - Finished task 0.0 in stage 117.0 (TID 171) in 7 ms on localhost (executor driver) (2/2)
19:48:38.493 INFO TaskSchedulerImpl - Removed TaskSet 117.0, whose tasks have all completed, from pool
19:48:38.493 INFO DAGScheduler - ResultStage 117 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.009 s
19:48:38.493 INFO DAGScheduler - Job 81 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.493 INFO TaskSchedulerImpl - Killing all running tasks in stage 117: Stage finished
19:48:38.493 INFO DAGScheduler - Job 81 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010034 s
19:48:38.507 INFO FileInputFormat - Total input files to process : 2
19:48:38.510 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
19:48:38.511 INFO DAGScheduler - Got job 82 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
19:48:38.511 INFO DAGScheduler - Final stage: ResultStage 118 (count at ReadsSparkSinkUnitTest.java:222)
19:48:38.511 INFO DAGScheduler - Parents of final stage: List()
19:48:38.511 INFO DAGScheduler - Missing parents: List()
19:48:38.511 INFO DAGScheduler - Submitting ResultStage 118 (MapPartitionsRDD[529] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:38.527 INFO MemoryStore - Block broadcast_214 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:48:38.529 INFO MemoryStore - Block broadcast_214_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.3 MiB)
19:48:38.529 INFO BlockManagerInfo - Added broadcast_214_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:38.529 INFO SparkContext - Created broadcast 214 from broadcast at DAGScheduler.scala:1580
19:48:38.529 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 118 (MapPartitionsRDD[529] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
19:48:38.529 INFO TaskSchedulerImpl - Adding task set 118.0 with 2 tasks resource profile 0
19:48:38.530 INFO TaskSetManager - Starting task 0.0 in stage 118.0 (TID 173) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
19:48:38.530 INFO TaskSetManager - Starting task 1.0 in stage 118.0 (TID 174) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
19:48:38.530 INFO Executor - Running task 0.0 in stage 118.0 (TID 173)
19:48:38.530 INFO Executor - Running task 1.0 in stage 118.0 (TID 174)
19:48:38.567 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest614129085813003995843.sam/part-r-00001.bam:0+129330
19:48:38.576 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest614129085813003995843.sam/part-r-00000.bam:0+132492
19:48:38.576 INFO Executor - Finished task 0.0 in stage 118.0 (TID 173). 989 bytes result sent to driver
19:48:38.577 INFO TaskSetManager - Finished task 0.0 in stage 118.0 (TID 173) in 47 ms on localhost (executor driver) (1/2)
19:48:38.589 INFO Executor - Finished task 1.0 in stage 118.0 (TID 174). 989 bytes result sent to driver
19:48:38.589 INFO TaskSetManager - Finished task 1.0 in stage 118.0 (TID 174) in 59 ms on localhost (executor driver) (2/2)
19:48:38.589 INFO TaskSchedulerImpl - Removed TaskSet 118.0, whose tasks have all completed, from pool
19:48:38.589 INFO DAGScheduler - ResultStage 118 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.078 s
19:48:38.589 INFO DAGScheduler - Job 82 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.589 INFO TaskSchedulerImpl - Killing all running tasks in stage 118: Stage finished
19:48:38.589 INFO DAGScheduler - Job 82 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.079260 s
19:48:38.593 INFO MemoryStore - Block broadcast_215 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
19:48:38.600 INFO MemoryStore - Block broadcast_215_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
19:48:38.600 INFO BlockManagerInfo - Added broadcast_215_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:38.600 INFO SparkContext - Created broadcast 215 from newAPIHadoopFile at PathSplitSource.java:96
19:48:38.621 INFO MemoryStore - Block broadcast_216 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
19:48:38.627 INFO MemoryStore - Block broadcast_216_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
19:48:38.628 INFO BlockManagerInfo - Added broadcast_216_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:38.628 INFO SparkContext - Created broadcast 216 from newAPIHadoopFile at PathSplitSource.java:96
19:48:38.648 INFO FileInputFormat - Total input files to process : 1
19:48:38.650 INFO MemoryStore - Block broadcast_217 stored as values in memory (estimated size 160.7 KiB, free 1916.5 MiB)
19:48:38.651 INFO MemoryStore - Block broadcast_217_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.5 MiB)
19:48:38.651 INFO BlockManagerInfo - Added broadcast_217_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:48:38.651 INFO SparkContext - Created broadcast 217 from broadcast at ReadsSparkSink.java:133
19:48:38.653 INFO MemoryStore - Block broadcast_218 stored as values in memory (estimated size 163.2 KiB, free 1916.3 MiB)
19:48:38.653 INFO MemoryStore - Block broadcast_218_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.3 MiB)
19:48:38.653 INFO BlockManagerInfo - Added broadcast_218_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:48:38.654 INFO SparkContext - Created broadcast 218 from broadcast at BamSink.java:76
19:48:38.655 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.655 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.655 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
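
The repeated committer lines come from Hadoop defaults: with no PathOutputCommitterFactory configured, FileOutputCommitterFactory is used, and FileOutputCommitter runs with algorithm version 1 and cleanup enabled. Both knobs are configuration-driven; a sketch is below, with key names given as I understand Hadoop's defaults, so verify them against the Hadoop version in use.

    import org.apache.hadoop.conf.Configuration;

    // Key names are assumptions based on Hadoop's documented configuration;
    // confirm before relying on them.
    final class CommitterConfSketch {
        static Configuration committerConf() {
            Configuration conf = new Configuration();
            // Algorithm 1 (the logged default) renames task output at job commit;
            // algorithm 2 renames at task commit, trading job atomicity for speed.
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            // Left unset, this produces the "defaulting to FileOutputCommitterFactory" line.
            // conf.set("mapreduce.outputcommitter.factory.class", "<factory class>");
            return conf;
        }
    }
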
19:48:38.673 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:38.673 INFO DAGScheduler - Registering RDD 543 (mapToPair at SparkUtils.java:161) as input to shuffle 25
19:48:38.673 INFO DAGScheduler - Got job 83 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:38.674 INFO DAGScheduler - Final stage: ResultStage 120 (runJob at SparkHadoopWriter.scala:83)
19:48:38.674 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 119)
19:48:38.674 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 119)
19:48:38.674 INFO DAGScheduler - Submitting ShuffleMapStage 119 (MapPartitionsRDD[543] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:38.691 INFO MemoryStore - Block broadcast_219 stored as values in memory (estimated size 520.4 KiB, free 1915.8 MiB)
19:48:38.692 INFO MemoryStore - Block broadcast_219_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.6 MiB)
19:48:38.692 INFO BlockManagerInfo - Added broadcast_219_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:48:38.693 INFO SparkContext - Created broadcast 219 from broadcast at DAGScheduler.scala:1580
19:48:38.693 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 119 (MapPartitionsRDD[543] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:38.693 INFO TaskSchedulerImpl - Adding task set 119.0 with 1 tasks resource profile 0
19:48:38.694 INFO TaskSetManager - Starting task 0.0 in stage 119.0 (TID 175) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:38.694 INFO Executor - Running task 0.0 in stage 119.0 (TID 175)
19:48:38.725 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:38.749 INFO Executor - Finished task 0.0 in stage 119.0 (TID 175). 1234 bytes result sent to driver
19:48:38.750 INFO BlockManagerInfo - Removed broadcast_216_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:38.750 INFO TaskSetManager - Finished task 0.0 in stage 119.0 (TID 175) in 57 ms on localhost (executor driver) (1/1)
19:48:38.750 INFO TaskSchedulerImpl - Removed TaskSet 119.0, whose tasks have all completed, from pool
19:48:38.750 INFO DAGScheduler - ShuffleMapStage 119 (mapToPair at SparkUtils.java:161) finished in 0.076 s
19:48:38.750 INFO DAGScheduler - looking for newly runnable stages
19:48:38.750 INFO DAGScheduler - running: HashSet()
19:48:38.750 INFO DAGScheduler - waiting: HashSet(ResultStage 120)
19:48:38.750 INFO DAGScheduler - failed: HashSet()
19:48:38.750 INFO DAGScheduler - Submitting ResultStage 120 (MapPartitionsRDD[548] at mapToPair at BamSink.java:91), which has no missing parents
19:48:38.751 INFO BlockManagerInfo - Removed broadcast_210_piece0 on localhost:36125 in memory (size: 154.6 KiB, free: 1919.4 MiB)
19:48:38.752 INFO BlockManagerInfo - Removed broadcast_213_piece0 on localhost:36125 in memory (size: 3.4 KiB, free: 1919.4 MiB)
19:48:38.752 INFO BlockManagerInfo - Removed broadcast_208_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:38.754 INFO BlockManagerInfo - Removed broadcast_206_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:38.754 INFO BlockManagerInfo - Removed broadcast_212_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:38.755 INFO BlockManagerInfo - Removed broadcast_214_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.7 MiB)
19:48:38.755 INFO BlockManagerInfo - Removed broadcast_211_piece0 on localhost:36125 in memory (size: 56.2 KiB, free: 1919.7 MiB)
19:48:38.756 INFO BlockManagerInfo - Removed broadcast_207_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:38.756 INFO BlockManagerInfo - Removed broadcast_209_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.8 MiB)
19:48:38.760 INFO MemoryStore - Block broadcast_220 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
19:48:38.761 INFO MemoryStore - Block broadcast_220_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
19:48:38.761 INFO BlockManagerInfo - Added broadcast_220_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.7 MiB)
19:48:38.761 INFO SparkContext - Created broadcast 220 from broadcast at DAGScheduler.scala:1580
19:48:38.761 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 120 (MapPartitionsRDD[548] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:38.761 INFO TaskSchedulerImpl - Adding task set 120.0 with 1 tasks resource profile 0
19:48:38.762 INFO TaskSetManager - Starting task 0.0 in stage 120.0 (TID 176) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:38.762 INFO Executor - Running task 0.0 in stage 120.0 (TID 176)
19:48:38.766 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:38.766 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:38.778 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.778 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.778 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.778 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:38.779 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:38.779 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:38.804 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948388478450232232335426_0548_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest17029703665555237546.bam.parts/_temporary/0/task_202507151948388478450232232335426_0548_r_000000
19:48:38.804 INFO SparkHadoopMapRedUtil - attempt_202507151948388478450232232335426_0548_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:38.805 INFO Executor - Finished task 0.0 in stage 120.0 (TID 176). 1858 bytes result sent to driver
19:48:38.805 INFO TaskSetManager - Finished task 0.0 in stage 120.0 (TID 176) in 43 ms on localhost (executor driver) (1/1)
19:48:38.805 INFO TaskSchedulerImpl - Removed TaskSet 120.0, whose tasks have all completed, from pool
19:48:38.805 INFO DAGScheduler - ResultStage 120 (runJob at SparkHadoopWriter.scala:83) finished in 0.054 s
19:48:38.806 INFO DAGScheduler - Job 83 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.806 INFO TaskSchedulerImpl - Killing all running tasks in stage 120: Stage finished
19:48:38.806 INFO DAGScheduler - Job 83 finished: runJob at SparkHadoopWriter.scala:83, took 0.132913 s
19:48:38.806 INFO SparkHadoopWriter - Start to commit write Job job_202507151948388478450232232335426_0548.
19:48:38.811 INFO SparkHadoopWriter - Write Job job_202507151948388478450232232335426_0548 committed. Elapsed time: 5 ms.
19:48:38.823 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest17029703665555237546.bam
19:48:38.827 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest17029703665555237546.bam done
19:48:38.827 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest17029703665555237546.bam.parts/ to /tmp/ReadsSparkSinkUnitTest17029703665555237546.bam.sbi
19:48:38.832 INFO IndexFileMerger - Done merging .sbi files
19:48:38.832 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest17029703665555237546.bam.parts/ to /tmp/ReadsSparkSinkUnitTest17029703665555237546.bam.bai
19:48:38.837 INFO IndexFileMerger - Done merging .bai files
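
The write above lands as part files under the .bam.parts directory; HadoopFileSystemWrapper then concatenates them into a single BAM and IndexFileMerger merges the per-part .sbi and .bai indexes. The simplified sketch below shows only the generic concatenate-the-parts step; it is byte-level and ignores BAM/BGZF structure, so it is not a substitute for the format-aware merger used here.

    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;

    // Byte-level illustration of "concatenate the sorted parts" only; the real
    // merger also handles the BAM header, BGZF terminator, and index merging.
    final class ConcatSketch {
        static void concatParts(Path partsDir, Path target) throws IOException {
            List<Path> parts;
            try (Stream<Path> listing = Files.list(partsDir)) {
                parts = listing
                        .filter(p -> p.getFileName().toString().startsWith("part-"))
                        .sorted()
                        .collect(Collectors.toList());
            }
            try (OutputStream out = Files.newOutputStream(target)) {
                for (Path part : parts) {
                    Files.copy(part, out);   // append each part's bytes in order
                }
            }
        }
    }
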
19:48:38.839 INFO MemoryStore - Block broadcast_221 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
19:48:38.840 INFO MemoryStore - Block broadcast_221_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
19:48:38.840 INFO BlockManagerInfo - Added broadcast_221_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.7 MiB)
19:48:38.840 INFO SparkContext - Created broadcast 221 from broadcast at BamSource.java:104
19:48:38.841 INFO MemoryStore - Block broadcast_222 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:38.847 INFO MemoryStore - Block broadcast_222_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:38.848 INFO BlockManagerInfo - Added broadcast_222_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:38.848 INFO SparkContext - Created broadcast 222 from newAPIHadoopFile at PathSplitSource.java:96
19:48:38.856 INFO FileInputFormat - Total input files to process : 1
19:48:38.870 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:38.871 INFO DAGScheduler - Got job 84 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:38.871 INFO DAGScheduler - Final stage: ResultStage 121 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:38.871 INFO DAGScheduler - Parents of final stage: List()
19:48:38.871 INFO DAGScheduler - Missing parents: List()
19:48:38.871 INFO DAGScheduler - Submitting ResultStage 121 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:38.877 INFO MemoryStore - Block broadcast_223 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
19:48:38.878 INFO MemoryStore - Block broadcast_223_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
19:48:38.878 INFO BlockManagerInfo - Added broadcast_223_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:38.878 INFO SparkContext - Created broadcast 223 from broadcast at DAGScheduler.scala:1580
19:48:38.878 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 121 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:38.878 INFO TaskSchedulerImpl - Adding task set 121.0 with 1 tasks resource profile 0
19:48:38.879 INFO TaskSetManager - Starting task 0.0 in stage 121.0 (TID 177) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:38.879 INFO Executor - Running task 0.0 in stage 121.0 (TID 177)
19:48:38.890 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17029703665555237546.bam:0+237038
19:48:38.895 INFO Executor - Finished task 0.0 in stage 121.0 (TID 177). 651483 bytes result sent to driver
19:48:38.896 INFO TaskSetManager - Finished task 0.0 in stage 121.0 (TID 177) in 18 ms on localhost (executor driver) (1/1)
19:48:38.896 INFO TaskSchedulerImpl - Removed TaskSet 121.0, whose tasks have all completed, from pool
19:48:38.896 INFO DAGScheduler - ResultStage 121 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
19:48:38.896 INFO DAGScheduler - Job 84 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.896 INFO TaskSchedulerImpl - Killing all running tasks in stage 121: Stage finished
19:48:38.897 INFO DAGScheduler - Job 84 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026157 s
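
Job 84 collects the round-tripped reads back to the driver, which is why its task result is roughly 651 KB where the count jobs return about 1 KB. collect() is fine for this small test input but pulls every record into driver memory; a trivial sketch:

    import java.util.List;
    import org.apache.spark.api.java.JavaRDD;

    // collect() materialises every record on the driver (hence the large task
    // result above); acceptable for small test inputs only.
    final class CollectSketch {
        static <T> List<T> fetchAll(JavaRDD<T> readBack) {
            return readBack.collect();
        }
    }

For larger inputs, comparing counts or sampling with take(n) keeps the driver's memory footprint bounded.
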
19:48:38.906 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:38.906 INFO DAGScheduler - Got job 85 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:38.906 INFO DAGScheduler - Final stage: ResultStage 122 (count at ReadsSparkSinkUnitTest.java:185)
19:48:38.906 INFO DAGScheduler - Parents of final stage: List()
19:48:38.906 INFO DAGScheduler - Missing parents: List()
19:48:38.907 INFO DAGScheduler - Submitting ResultStage 122 (MapPartitionsRDD[536] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:38.923 INFO MemoryStore - Block broadcast_224 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:48:38.924 INFO MemoryStore - Block broadcast_224_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
19:48:38.925 INFO BlockManagerInfo - Added broadcast_224_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:38.925 INFO SparkContext - Created broadcast 224 from broadcast at DAGScheduler.scala:1580
19:48:38.925 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 122 (MapPartitionsRDD[536] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:38.925 INFO TaskSchedulerImpl - Adding task set 122.0 with 1 tasks resource profile 0
19:48:38.926 INFO TaskSetManager - Starting task 0.0 in stage 122.0 (TID 178) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:38.926 INFO Executor - Running task 0.0 in stage 122.0 (TID 178)
19:48:38.955 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:38.965 INFO Executor - Finished task 0.0 in stage 122.0 (TID 178). 989 bytes result sent to driver
19:48:38.965 INFO TaskSetManager - Finished task 0.0 in stage 122.0 (TID 178) in 40 ms on localhost (executor driver) (1/1)
19:48:38.965 INFO TaskSchedulerImpl - Removed TaskSet 122.0, whose tasks have all completed, from pool
19:48:38.965 INFO DAGScheduler - ResultStage 122 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
19:48:38.966 INFO DAGScheduler - Job 85 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.966 INFO TaskSchedulerImpl - Killing all running tasks in stage 122: Stage finished
19:48:38.966 INFO DAGScheduler - Job 85 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059533 s
19:48:38.969 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:38.969 INFO DAGScheduler - Got job 86 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:38.969 INFO DAGScheduler - Final stage: ResultStage 123 (count at ReadsSparkSinkUnitTest.java:185)
19:48:38.969 INFO DAGScheduler - Parents of final stage: List()
19:48:38.969 INFO DAGScheduler - Missing parents: List()
19:48:38.970 INFO DAGScheduler - Submitting ResultStage 123 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:38.976 INFO MemoryStore - Block broadcast_225 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
19:48:38.976 INFO MemoryStore - Block broadcast_225_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
19:48:38.976 INFO BlockManagerInfo - Added broadcast_225_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.4 MiB)
19:48:38.977 INFO SparkContext - Created broadcast 225 from broadcast at DAGScheduler.scala:1580
19:48:38.977 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 123 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:38.977 INFO TaskSchedulerImpl - Adding task set 123.0 with 1 tasks resource profile 0
19:48:38.977 INFO TaskSetManager - Starting task 0.0 in stage 123.0 (TID 179) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:38.977 INFO Executor - Running task 0.0 in stage 123.0 (TID 179)
19:48:38.992 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17029703665555237546.bam:0+237038
19:48:38.996 INFO Executor - Finished task 0.0 in stage 123.0 (TID 179). 989 bytes result sent to driver
19:48:38.996 INFO TaskSetManager - Finished task 0.0 in stage 123.0 (TID 179) in 19 ms on localhost (executor driver) (1/1)
19:48:38.996 INFO TaskSchedulerImpl - Removed TaskSet 123.0, whose tasks have all completed, from pool
19:48:38.996 INFO DAGScheduler - ResultStage 123 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
19:48:38.996 INFO DAGScheduler - Job 86 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:38.996 INFO TaskSchedulerImpl - Killing all running tasks in stage 123: Stage finished
19:48:38.996 INFO DAGScheduler - Job 86 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027407 s
19:48:38.999 INFO MemoryStore - Block broadcast_226 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
19:48:39.005 INFO MemoryStore - Block broadcast_226_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
19:48:39.005 INFO BlockManagerInfo - Added broadcast_226_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:39.005 INFO SparkContext - Created broadcast 226 from newAPIHadoopFile at PathSplitSource.java:96
19:48:39.027 INFO MemoryStore - Block broadcast_227 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
19:48:39.033 INFO MemoryStore - Block broadcast_227_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
19:48:39.033 INFO BlockManagerInfo - Added broadcast_227_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:39.033 INFO SparkContext - Created broadcast 227 from newAPIHadoopFile at PathSplitSource.java:96
19:48:39.053 INFO FileInputFormat - Total input files to process : 1
19:48:39.055 INFO MemoryStore - Block broadcast_228 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
19:48:39.056 INFO MemoryStore - Block broadcast_228_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
19:48:39.056 INFO BlockManagerInfo - Added broadcast_228_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:39.056 INFO SparkContext - Created broadcast 228 from broadcast at ReadsSparkSink.java:133
19:48:39.057 INFO MemoryStore - Block broadcast_229 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
19:48:39.058 INFO MemoryStore - Block broadcast_229_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
19:48:39.058 INFO BlockManagerInfo - Added broadcast_229_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:39.058 INFO SparkContext - Created broadcast 229 from broadcast at BamSink.java:76
19:48:39.060 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:39.060 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:39.060 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:39.077 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:39.077 INFO DAGScheduler - Registering RDD 568 (mapToPair at SparkUtils.java:161) as input to shuffle 26
19:48:39.077 INFO DAGScheduler - Got job 87 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:39.077 INFO DAGScheduler - Final stage: ResultStage 125 (runJob at SparkHadoopWriter.scala:83)
19:48:39.077 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 124)
19:48:39.078 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 124)
19:48:39.078 INFO DAGScheduler - Submitting ShuffleMapStage 124 (MapPartitionsRDD[568] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:39.095 INFO MemoryStore - Block broadcast_230 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
19:48:39.096 INFO MemoryStore - Block broadcast_230_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
19:48:39.096 INFO BlockManagerInfo - Added broadcast_230_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.1 MiB)
19:48:39.096 INFO SparkContext - Created broadcast 230 from broadcast at DAGScheduler.scala:1580
19:48:39.096 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 124 (MapPartitionsRDD[568] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:39.096 INFO TaskSchedulerImpl - Adding task set 124.0 with 1 tasks resource profile 0
19:48:39.097 INFO TaskSetManager - Starting task 0.0 in stage 124.0 (TID 180) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:39.097 INFO Executor - Running task 0.0 in stage 124.0 (TID 180)
19:48:39.128 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:39.145 INFO Executor - Finished task 0.0 in stage 124.0 (TID 180). 1148 bytes result sent to driver
19:48:39.145 INFO TaskSetManager - Finished task 0.0 in stage 124.0 (TID 180) in 48 ms on localhost (executor driver) (1/1)
19:48:39.146 INFO TaskSchedulerImpl - Removed TaskSet 124.0, whose tasks have all completed, from pool
19:48:39.146 INFO DAGScheduler - ShuffleMapStage 124 (mapToPair at SparkUtils.java:161) finished in 0.068 s
19:48:39.146 INFO DAGScheduler - looking for newly runnable stages
19:48:39.146 INFO DAGScheduler - running: HashSet()
19:48:39.146 INFO DAGScheduler - waiting: HashSet(ResultStage 125)
19:48:39.146 INFO DAGScheduler - failed: HashSet()
19:48:39.146 INFO DAGScheduler - Submitting ResultStage 125 (MapPartitionsRDD[573] at mapToPair at BamSink.java:91), which has no missing parents
19:48:39.153 INFO MemoryStore - Block broadcast_231 stored as values in memory (estimated size 241.4 KiB, free 1915.1 MiB)
19:48:39.154 INFO MemoryStore - Block broadcast_231_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.1 MiB)
19:48:39.154 INFO BlockManagerInfo - Added broadcast_231_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.1 MiB)
19:48:39.155 INFO SparkContext - Created broadcast 231 from broadcast at DAGScheduler.scala:1580
19:48:39.155 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 125 (MapPartitionsRDD[573] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:39.155 INFO TaskSchedulerImpl - Adding task set 125.0 with 1 tasks resource profile 0
19:48:39.155 INFO TaskSetManager - Starting task 0.0 in stage 125.0 (TID 181) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:39.156 INFO Executor - Running task 0.0 in stage 125.0 (TID 181)
19:48:39.160 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:39.160 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:39.173 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:39.173 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:39.173 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:39.174 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:39.174 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:39.174 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:39.202 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948393980170273423144465_0573_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest112253482889405764259.bam.parts/_temporary/0/task_202507151948393980170273423144465_0573_r_000000
19:48:39.202 INFO SparkHadoopMapRedUtil - attempt_202507151948393980170273423144465_0573_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:39.202 INFO Executor - Finished task 0.0 in stage 125.0 (TID 181). 1858 bytes result sent to driver
19:48:39.203 INFO TaskSetManager - Finished task 0.0 in stage 125.0 (TID 181) in 48 ms on localhost (executor driver) (1/1)
19:48:39.203 INFO TaskSchedulerImpl - Removed TaskSet 125.0, whose tasks have all completed, from pool
19:48:39.203 INFO DAGScheduler - ResultStage 125 (runJob at SparkHadoopWriter.scala:83) finished in 0.057 s
19:48:39.203 INFO DAGScheduler - Job 87 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:39.203 INFO TaskSchedulerImpl - Killing all running tasks in stage 125: Stage finished
19:48:39.203 INFO DAGScheduler - Job 87 finished: runJob at SparkHadoopWriter.scala:83, took 0.126664 s
19:48:39.204 INFO SparkHadoopWriter - Start to commit write Job job_202507151948393980170273423144465_0573.
19:48:39.209 INFO SparkHadoopWriter - Write Job job_202507151948393980170273423144465_0573 committed. Elapsed time: 4 ms.
19:48:39.220 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest112253482889405764259.bam
19:48:39.224 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest112253482889405764259.bam done
19:48:39.225 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest112253482889405764259.bam.parts/ to /tmp/ReadsSparkSinkUnitTest112253482889405764259.bam.sbi
19:48:39.229 INFO IndexFileMerger - Done merging .sbi files
19:48:39.230 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest112253482889405764259.bam.parts/ to /tmp/ReadsSparkSinkUnitTest112253482889405764259.bam.bai
19:48:39.235 INFO IndexFileMerger - Done merging .bai files
19:48:39.237 INFO MemoryStore - Block broadcast_232 stored as values in memory (estimated size 13.3 KiB, free 1915.1 MiB)
19:48:39.237 INFO MemoryStore - Block broadcast_232_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1915.0 MiB)
19:48:39.237 INFO BlockManagerInfo - Added broadcast_232_piece0 in memory on localhost:36125 (size: 8.3 KiB, free: 1919.0 MiB)
19:48:39.238 INFO SparkContext - Created broadcast 232 from broadcast at BamSource.java:104
19:48:39.238 INFO MemoryStore - Block broadcast_233 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
19:48:39.248 INFO BlockManagerInfo - Removed broadcast_215_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.1 MiB)
19:48:39.249 INFO BlockManagerInfo - Removed broadcast_222_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.1 MiB)
19:48:39.249 INFO BlockManagerInfo - Removed broadcast_224_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.3 MiB)
19:48:39.250 INFO BlockManagerInfo - Removed broadcast_228_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:48:39.250 INFO BlockManagerInfo - Removed broadcast_220_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.4 MiB)
19:48:39.252 INFO BlockManagerInfo - Removed broadcast_218_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:39.252 INFO BlockManagerInfo - Removed broadcast_231_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.4 MiB)
19:48:39.253 INFO BlockManagerInfo - Removed broadcast_221_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.4 MiB)
19:48:39.253 INFO BlockManagerInfo - Removed broadcast_229_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.5 MiB)
19:48:39.254 INFO BlockManagerInfo - Removed broadcast_227_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:39.255 INFO BlockManagerInfo - Removed broadcast_217_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.5 MiB)
19:48:39.255 INFO MemoryStore - Block broadcast_233_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.6 MiB)
19:48:39.255 INFO BlockManagerInfo - Added broadcast_233_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:39.255 INFO BlockManagerInfo - Removed broadcast_230_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.6 MiB)
19:48:39.256 INFO BlockManagerInfo - Removed broadcast_225_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.7 MiB)
19:48:39.256 INFO SparkContext - Created broadcast 233 from newAPIHadoopFile at PathSplitSource.java:96
19:48:39.257 INFO BlockManagerInfo - Removed broadcast_219_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.8 MiB)
19:48:39.257 INFO BlockManagerInfo - Removed broadcast_223_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.9 MiB)
19:48:39.265 INFO FileInputFormat - Total input files to process : 1
19:48:39.279 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:39.280 INFO DAGScheduler - Got job 88 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:39.280 INFO DAGScheduler - Final stage: ResultStage 126 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:39.280 INFO DAGScheduler - Parents of final stage: List()
19:48:39.280 INFO DAGScheduler - Missing parents: List()
19:48:39.280 INFO DAGScheduler - Submitting ResultStage 126 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:39.290 INFO MemoryStore - Block broadcast_234 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
19:48:39.291 INFO MemoryStore - Block broadcast_234_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
19:48:39.291 INFO BlockManagerInfo - Added broadcast_234_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:39.292 INFO SparkContext - Created broadcast 234 from broadcast at DAGScheduler.scala:1580
19:48:39.292 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 126 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:39.292 INFO TaskSchedulerImpl - Adding task set 126.0 with 1 tasks resource profile 0
19:48:39.292 INFO TaskSetManager - Starting task 0.0 in stage 126.0 (TID 182) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:39.293 INFO Executor - Running task 0.0 in stage 126.0 (TID 182)
19:48:39.304 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest112253482889405764259.bam:0+237038
19:48:39.309 INFO Executor - Finished task 0.0 in stage 126.0 (TID 182). 651483 bytes result sent to driver
19:48:39.310 INFO TaskSetManager - Finished task 0.0 in stage 126.0 (TID 182) in 18 ms on localhost (executor driver) (1/1)
19:48:39.310 INFO TaskSchedulerImpl - Removed TaskSet 126.0, whose tasks have all completed, from pool
19:48:39.310 INFO DAGScheduler - ResultStage 126 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
19:48:39.310 INFO DAGScheduler - Job 88 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:39.310 INFO TaskSchedulerImpl - Killing all running tasks in stage 126: Stage finished
19:48:39.311 INFO DAGScheduler - Job 88 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.031152 s
19:48:39.324 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:39.325 INFO DAGScheduler - Got job 89 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:39.325 INFO DAGScheduler - Final stage: ResultStage 127 (count at ReadsSparkSinkUnitTest.java:185)
19:48:39.325 INFO DAGScheduler - Parents of final stage: List()
19:48:39.325 INFO DAGScheduler - Missing parents: List()
19:48:39.325 INFO DAGScheduler - Submitting ResultStage 127 (MapPartitionsRDD[561] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:39.341 INFO MemoryStore - Block broadcast_235 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
19:48:39.343 INFO MemoryStore - Block broadcast_235_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.5 MiB)
19:48:39.343 INFO BlockManagerInfo - Added broadcast_235_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:39.343 INFO SparkContext - Created broadcast 235 from broadcast at DAGScheduler.scala:1580
19:48:39.343 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 127 (MapPartitionsRDD[561] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:39.343 INFO TaskSchedulerImpl - Adding task set 127.0 with 1 tasks resource profile 0
19:48:39.344 INFO TaskSetManager - Starting task 0.0 in stage 127.0 (TID 183) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:39.344 INFO Executor - Running task 0.0 in stage 127.0 (TID 183)
19:48:39.375 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:39.385 INFO Executor - Finished task 0.0 in stage 127.0 (TID 183). 989 bytes result sent to driver
19:48:39.385 INFO TaskSetManager - Finished task 0.0 in stage 127.0 (TID 183) in 41 ms on localhost (executor driver) (1/1)
19:48:39.386 INFO TaskSchedulerImpl - Removed TaskSet 127.0, whose tasks have all completed, from pool
19:48:39.386 INFO DAGScheduler - ResultStage 127 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.061 s
19:48:39.386 INFO DAGScheduler - Job 89 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:39.386 INFO TaskSchedulerImpl - Killing all running tasks in stage 127: Stage finished
19:48:39.386 INFO DAGScheduler - Job 89 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.061380 s
19:48:39.389 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:39.389 INFO DAGScheduler - Got job 90 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:39.389 INFO DAGScheduler - Final stage: ResultStage 128 (count at ReadsSparkSinkUnitTest.java:185)
19:48:39.389 INFO DAGScheduler - Parents of final stage: List()
19:48:39.389 INFO DAGScheduler - Missing parents: List()
19:48:39.390 INFO DAGScheduler - Submitting ResultStage 128 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:39.396 INFO MemoryStore - Block broadcast_236 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
19:48:39.396 INFO MemoryStore - Block broadcast_236_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.3 MiB)
19:48:39.397 INFO BlockManagerInfo - Added broadcast_236_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.6 MiB)
19:48:39.397 INFO SparkContext - Created broadcast 236 from broadcast at DAGScheduler.scala:1580
19:48:39.397 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 128 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:39.397 INFO TaskSchedulerImpl - Adding task set 128.0 with 1 tasks resource profile 0
19:48:39.397 INFO TaskSetManager - Starting task 0.0 in stage 128.0 (TID 184) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:39.398 INFO Executor - Running task 0.0 in stage 128.0 (TID 184)
19:48:39.408 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest112253482889405764259.bam:0+237038
19:48:39.411 INFO Executor - Finished task 0.0 in stage 128.0 (TID 184). 989 bytes result sent to driver
19:48:39.412 INFO TaskSetManager - Finished task 0.0 in stage 128.0 (TID 184) in 15 ms on localhost (executor driver) (1/1)
19:48:39.412 INFO TaskSchedulerImpl - Removed TaskSet 128.0, whose tasks have all completed, from pool
19:48:39.412 INFO DAGScheduler - ResultStage 128 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
19:48:39.412 INFO DAGScheduler - Job 90 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:39.412 INFO TaskSchedulerImpl - Killing all running tasks in stage 128: Stage finished
19:48:39.412 INFO DAGScheduler - Job 90 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022948 s
19:48:39.416 INFO MemoryStore - Block broadcast_237 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
19:48:39.426 INFO MemoryStore - Block broadcast_237_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:39.426 INFO BlockManagerInfo - Added broadcast_237_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:39.426 INFO SparkContext - Created broadcast 237 from newAPIHadoopFile at PathSplitSource.java:96
19:48:39.448 INFO MemoryStore - Block broadcast_238 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
19:48:39.454 INFO MemoryStore - Block broadcast_238_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:39.454 INFO BlockManagerInfo - Added broadcast_238_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:39.454 INFO SparkContext - Created broadcast 238 from newAPIHadoopFile at PathSplitSource.java:96
19:48:39.474 INFO FileInputFormat - Total input files to process : 1
19:48:39.475 INFO MemoryStore - Block broadcast_239 stored as values in memory (estimated size 160.7 KiB, free 1917.5 MiB)
19:48:39.476 INFO MemoryStore - Block broadcast_239_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
19:48:39.476 INFO BlockManagerInfo - Added broadcast_239_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:39.476 INFO SparkContext - Created broadcast 239 from broadcast at ReadsSparkSink.java:133
19:48:39.477 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:39.478 INFO MemoryStore - Block broadcast_240 stored as values in memory (estimated size 163.2 KiB, free 1917.3 MiB)
19:48:39.478 INFO MemoryStore - Block broadcast_240_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
19:48:39.478 INFO BlockManagerInfo - Added broadcast_240_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:39.479 INFO SparkContext - Created broadcast 240 from broadcast at BamSink.java:76
19:48:39.480 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:39.480 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:39.480 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:39.498 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:39.498 INFO DAGScheduler - Registering RDD 593 (mapToPair at SparkUtils.java:161) as input to shuffle 27
19:48:39.499 INFO DAGScheduler - Got job 91 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:39.499 INFO DAGScheduler - Final stage: ResultStage 130 (runJob at SparkHadoopWriter.scala:83)
19:48:39.499 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 129)
19:48:39.499 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 129)
19:48:39.499 INFO DAGScheduler - Submitting ShuffleMapStage 129 (MapPartitionsRDD[593] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:39.519 INFO MemoryStore - Block broadcast_241 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
19:48:39.521 INFO MemoryStore - Block broadcast_241_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
19:48:39.521 INFO BlockManagerInfo - Added broadcast_241_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.4 MiB)
19:48:39.521 INFO SparkContext - Created broadcast 241 from broadcast at DAGScheduler.scala:1580
19:48:39.521 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 129 (MapPartitionsRDD[593] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:39.521 INFO TaskSchedulerImpl - Adding task set 129.0 with 1 tasks resource profile 0
19:48:39.522 INFO TaskSetManager - Starting task 0.0 in stage 129.0 (TID 185) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:39.522 INFO Executor - Running task 0.0 in stage 129.0 (TID 185)
19:48:39.551 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:39.566 INFO Executor - Finished task 0.0 in stage 129.0 (TID 185). 1148 bytes result sent to driver
19:48:39.566 INFO TaskSetManager - Finished task 0.0 in stage 129.0 (TID 185) in 44 ms on localhost (executor driver) (1/1)
19:48:39.567 INFO TaskSchedulerImpl - Removed TaskSet 129.0, whose tasks have all completed, from pool
19:48:39.567 INFO DAGScheduler - ShuffleMapStage 129 (mapToPair at SparkUtils.java:161) finished in 0.068 s
19:48:39.567 INFO DAGScheduler - looking for newly runnable stages
19:48:39.567 INFO DAGScheduler - running: HashSet()
19:48:39.567 INFO DAGScheduler - waiting: HashSet(ResultStage 130)
19:48:39.567 INFO DAGScheduler - failed: HashSet()
19:48:39.567 INFO DAGScheduler - Submitting ResultStage 130 (MapPartitionsRDD[598] at mapToPair at BamSink.java:91), which has no missing parents
19:48:39.573 INFO MemoryStore - Block broadcast_242 stored as values in memory (estimated size 241.4 KiB, free 1916.4 MiB)
19:48:39.574 INFO MemoryStore - Block broadcast_242_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.4 MiB)
19:48:39.574 INFO BlockManagerInfo - Added broadcast_242_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.3 MiB)
19:48:39.575 INFO SparkContext - Created broadcast 242 from broadcast at DAGScheduler.scala:1580
19:48:39.575 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 130 (MapPartitionsRDD[598] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:39.575 INFO TaskSchedulerImpl - Adding task set 130.0 with 1 tasks resource profile 0
19:48:39.576 INFO TaskSetManager - Starting task 0.0 in stage 130.0 (TID 186) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:39.576 INFO Executor - Running task 0.0 in stage 130.0 (TID 186)
19:48:39.582 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:39.582 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:39.598 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:39.598 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:39.598 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:39.599 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:39.599 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:39.599 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:39.619 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948391307637224937406705_0598_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest115396565518379678291.bam.parts/_temporary/0/task_202507151948391307637224937406705_0598_r_000000
19:48:39.619 INFO SparkHadoopMapRedUtil - attempt_202507151948391307637224937406705_0598_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:39.620 INFO Executor - Finished task 0.0 in stage 130.0 (TID 186). 1858 bytes result sent to driver
19:48:39.620 INFO TaskSetManager - Finished task 0.0 in stage 130.0 (TID 186) in 45 ms on localhost (executor driver) (1/1)
19:48:39.620 INFO TaskSchedulerImpl - Removed TaskSet 130.0, whose tasks have all completed, from pool
19:48:39.620 INFO DAGScheduler - ResultStage 130 (runJob at SparkHadoopWriter.scala:83) finished in 0.053 s
19:48:39.620 INFO DAGScheduler - Job 91 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:39.621 INFO TaskSchedulerImpl - Killing all running tasks in stage 130: Stage finished
19:48:39.621 INFO DAGScheduler - Job 91 finished: runJob at SparkHadoopWriter.scala:83, took 0.122775 s
19:48:39.621 INFO SparkHadoopWriter - Start to commit write Job job_202507151948391307637224937406705_0598.
19:48:39.626 INFO SparkHadoopWriter - Write Job job_202507151948391307637224937406705_0598 committed. Elapsed time: 4 ms.
19:48:39.637 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest115396565518379678291.bam
19:48:39.642 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest115396565518379678291.bam done
19:48:39.642 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest115396565518379678291.bam.parts/ to /tmp/ReadsSparkSinkUnitTest115396565518379678291.bam.bai
19:48:39.647 INFO IndexFileMerger - Done merging .bai files
19:48:39.649 INFO MemoryStore - Block broadcast_243 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
19:48:39.655 INFO MemoryStore - Block broadcast_243_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
19:48:39.656 INFO BlockManagerInfo - Added broadcast_243_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.2 MiB)
19:48:39.656 INFO SparkContext - Created broadcast 243 from newAPIHadoopFile at PathSplitSource.java:96
19:48:39.676 INFO FileInputFormat - Total input files to process : 1
19:48:39.711 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:39.712 INFO DAGScheduler - Got job 92 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:39.712 INFO DAGScheduler - Final stage: ResultStage 131 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:39.712 INFO DAGScheduler - Parents of final stage: List()
19:48:39.712 INFO DAGScheduler - Missing parents: List()
19:48:39.712 INFO DAGScheduler - Submitting ResultStage 131 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:39.729 INFO MemoryStore - Block broadcast_244 stored as values in memory (estimated size 426.2 KiB, free 1915.6 MiB)
19:48:39.730 INFO MemoryStore - Block broadcast_244_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1915.4 MiB)
19:48:39.730 INFO BlockManagerInfo - Added broadcast_244_piece0 in memory on localhost:36125 (size: 153.7 KiB, free: 1919.1 MiB)
19:48:39.730 INFO SparkContext - Created broadcast 244 from broadcast at DAGScheduler.scala:1580
19:48:39.731 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 131 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:39.731 INFO TaskSchedulerImpl - Adding task set 131.0 with 1 tasks resource profile 0
19:48:39.731 INFO TaskSetManager - Starting task 0.0 in stage 131.0 (TID 187) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:39.732 INFO Executor - Running task 0.0 in stage 131.0 (TID 187)
19:48:39.760 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115396565518379678291.bam:0+237038
19:48:39.773 INFO Executor - Finished task 0.0 in stage 131.0 (TID 187). 651483 bytes result sent to driver
19:48:39.776 INFO TaskSetManager - Finished task 0.0 in stage 131.0 (TID 187) in 45 ms on localhost (executor driver) (1/1)
19:48:39.776 INFO TaskSchedulerImpl - Removed TaskSet 131.0, whose tasks have all completed, from pool
19:48:39.776 INFO DAGScheduler - ResultStage 131 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.064 s
19:48:39.776 INFO DAGScheduler - Job 92 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:39.776 INFO TaskSchedulerImpl - Killing all running tasks in stage 131: Stage finished
19:48:39.776 INFO DAGScheduler - Job 92 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.064697 s
19:48:39.791 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:39.791 INFO BlockManagerInfo - Removed broadcast_233_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.1 MiB)
19:48:39.792 INFO DAGScheduler - Got job 93 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:39.792 INFO DAGScheduler - Final stage: ResultStage 132 (count at ReadsSparkSinkUnitTest.java:185)
19:48:39.792 INFO DAGScheduler - Parents of final stage: List()
19:48:39.792 INFO DAGScheduler - Missing parents: List()
19:48:39.792 INFO DAGScheduler - Submitting ResultStage 132 (MapPartitionsRDD[586] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:39.792 INFO BlockManagerInfo - Removed broadcast_238_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:39.793 INFO BlockManagerInfo - Removed broadcast_239_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.2 MiB)
19:48:39.794 INFO BlockManagerInfo - Removed broadcast_244_piece0 on localhost:36125 in memory (size: 153.7 KiB, free: 1919.4 MiB)
19:48:39.795 INFO BlockManagerInfo - Removed broadcast_232_piece0 on localhost:36125 in memory (size: 8.3 KiB, free: 1919.4 MiB)
19:48:39.795 INFO BlockManagerInfo - Removed broadcast_235_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:39.796 INFO BlockManagerInfo - Removed broadcast_242_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.6 MiB)
19:48:39.796 INFO BlockManagerInfo - Removed broadcast_236_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.6 MiB)
19:48:39.797 INFO BlockManagerInfo - Removed broadcast_240_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:39.797 INFO BlockManagerInfo - Removed broadcast_241_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.8 MiB)
19:48:39.797 INFO BlockManagerInfo - Removed broadcast_226_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:39.798 INFO BlockManagerInfo - Removed broadcast_234_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.9 MiB)
19:48:39.812 INFO MemoryStore - Block broadcast_245 stored as values in memory (estimated size 426.1 KiB, free 1918.9 MiB)
19:48:39.813 INFO MemoryStore - Block broadcast_245_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.8 MiB)
19:48:39.813 INFO BlockManagerInfo - Added broadcast_245_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.8 MiB)
19:48:39.813 INFO SparkContext - Created broadcast 245 from broadcast at DAGScheduler.scala:1580
19:48:39.814 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 132 (MapPartitionsRDD[586] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:39.814 INFO TaskSchedulerImpl - Adding task set 132.0 with 1 tasks resource profile 0
19:48:39.814 INFO TaskSetManager - Starting task 0.0 in stage 132.0 (TID 188) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:39.814 INFO Executor - Running task 0.0 in stage 132.0 (TID 188)
19:48:39.844 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:39.854 INFO Executor - Finished task 0.0 in stage 132.0 (TID 188). 989 bytes result sent to driver
19:48:39.854 INFO TaskSetManager - Finished task 0.0 in stage 132.0 (TID 188) in 40 ms on localhost (executor driver) (1/1)
19:48:39.854 INFO TaskSchedulerImpl - Removed TaskSet 132.0, whose tasks have all completed, from pool
19:48:39.854 INFO DAGScheduler - ResultStage 132 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.062 s
19:48:39.854 INFO DAGScheduler - Job 93 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:39.854 INFO TaskSchedulerImpl - Killing all running tasks in stage 132: Stage finished
19:48:39.854 INFO DAGScheduler - Job 93 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.063182 s
19:48:39.859 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:39.859 INFO DAGScheduler - Got job 94 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:39.859 INFO DAGScheduler - Final stage: ResultStage 133 (count at ReadsSparkSinkUnitTest.java:185)
19:48:39.859 INFO DAGScheduler - Parents of final stage: List()
19:48:39.859 INFO DAGScheduler - Missing parents: List()
19:48:39.859 INFO DAGScheduler - Submitting ResultStage 133 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:39.876 INFO MemoryStore - Block broadcast_246 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
19:48:39.878 INFO MemoryStore - Block broadcast_246_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
19:48:39.878 INFO BlockManagerInfo - Added broadcast_246_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.6 MiB)
19:48:39.878 INFO SparkContext - Created broadcast 246 from broadcast at DAGScheduler.scala:1580
19:48:39.878 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 133 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:39.878 INFO TaskSchedulerImpl - Adding task set 133.0 with 1 tasks resource profile 0
19:48:39.879 INFO TaskSetManager - Starting task 0.0 in stage 133.0 (TID 189) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:39.879 INFO Executor - Running task 0.0 in stage 133.0 (TID 189)
19:48:39.908 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115396565518379678291.bam:0+237038
19:48:39.920 INFO Executor - Finished task 0.0 in stage 133.0 (TID 189). 989 bytes result sent to driver
19:48:39.920 INFO TaskSetManager - Finished task 0.0 in stage 133.0 (TID 189) in 41 ms on localhost (executor driver) (1/1)
19:48:39.920 INFO TaskSchedulerImpl - Removed TaskSet 133.0, whose tasks have all completed, from pool
19:48:39.920 INFO DAGScheduler - ResultStage 133 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
19:48:39.920 INFO DAGScheduler - Job 94 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:39.920 INFO TaskSchedulerImpl - Killing all running tasks in stage 133: Stage finished
19:48:39.920 INFO DAGScheduler - Job 94 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.061406 s
19:48:39.924 INFO MemoryStore - Block broadcast_247 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
19:48:39.931 INFO MemoryStore - Block broadcast_247_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.8 MiB)
19:48:39.931 INFO BlockManagerInfo - Added broadcast_247_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:39.931 INFO SparkContext - Created broadcast 247 from newAPIHadoopFile at PathSplitSource.java:96
19:48:39.953 INFO MemoryStore - Block broadcast_248 stored as values in memory (estimated size 297.9 KiB, free 1917.6 MiB)
19:48:39.964 INFO MemoryStore - Block broadcast_248_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
19:48:39.964 INFO BlockManagerInfo - Added broadcast_248_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:39.964 INFO SparkContext - Created broadcast 248 from newAPIHadoopFile at PathSplitSource.java:96
19:48:39.984 INFO FileInputFormat - Total input files to process : 1
19:48:39.986 INFO MemoryStore - Block broadcast_249 stored as values in memory (estimated size 160.7 KiB, free 1917.4 MiB)
19:48:39.987 INFO MemoryStore - Block broadcast_249_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
19:48:39.987 INFO BlockManagerInfo - Added broadcast_249_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:39.988 INFO SparkContext - Created broadcast 249 from broadcast at ReadsSparkSink.java:133
19:48:39.988 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:39.989 INFO MemoryStore - Block broadcast_250 stored as values in memory (estimated size 163.2 KiB, free 1917.2 MiB)
19:48:39.990 INFO MemoryStore - Block broadcast_250_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.2 MiB)
19:48:39.990 INFO BlockManagerInfo - Added broadcast_250_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:39.991 INFO SparkContext - Created broadcast 250 from broadcast at BamSink.java:76
19:48:39.993 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:39.993 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:39.993 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:40.012 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:40.013 INFO DAGScheduler - Registering RDD 619 (mapToPair at SparkUtils.java:161) as input to shuffle 28
19:48:40.013 INFO DAGScheduler - Got job 95 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:40.013 INFO DAGScheduler - Final stage: ResultStage 135 (runJob at SparkHadoopWriter.scala:83)
19:48:40.013 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 134)
19:48:40.013 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 134)
19:48:40.013 INFO DAGScheduler - Submitting ShuffleMapStage 134 (MapPartitionsRDD[619] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:40.031 INFO MemoryStore - Block broadcast_251 stored as values in memory (estimated size 520.4 KiB, free 1916.7 MiB)
19:48:40.032 INFO MemoryStore - Block broadcast_251_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.5 MiB)
19:48:40.032 INFO BlockManagerInfo - Added broadcast_251_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.3 MiB)
19:48:40.032 INFO SparkContext - Created broadcast 251 from broadcast at DAGScheduler.scala:1580
19:48:40.033 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 134 (MapPartitionsRDD[619] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:40.033 INFO TaskSchedulerImpl - Adding task set 134.0 with 1 tasks resource profile 0
19:48:40.033 INFO TaskSetManager - Starting task 0.0 in stage 134.0 (TID 190) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:40.033 INFO Executor - Running task 0.0 in stage 134.0 (TID 190)
19:48:40.063 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:40.080 INFO Executor - Finished task 0.0 in stage 134.0 (TID 190). 1148 bytes result sent to driver
19:48:40.080 INFO TaskSetManager - Finished task 0.0 in stage 134.0 (TID 190) in 47 ms on localhost (executor driver) (1/1)
19:48:40.080 INFO TaskSchedulerImpl - Removed TaskSet 134.0, whose tasks have all completed, from pool
19:48:40.080 INFO DAGScheduler - ShuffleMapStage 134 (mapToPair at SparkUtils.java:161) finished in 0.067 s
19:48:40.080 INFO DAGScheduler - looking for newly runnable stages
19:48:40.080 INFO DAGScheduler - running: HashSet()
19:48:40.080 INFO DAGScheduler - waiting: HashSet(ResultStage 135)
19:48:40.080 INFO DAGScheduler - failed: HashSet()
19:48:40.081 INFO DAGScheduler - Submitting ResultStage 135 (MapPartitionsRDD[624] at mapToPair at BamSink.java:91), which has no missing parents
19:48:40.087 INFO MemoryStore - Block broadcast_252 stored as values in memory (estimated size 241.4 KiB, free 1916.3 MiB)
19:48:40.088 INFO MemoryStore - Block broadcast_252_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.2 MiB)
19:48:40.088 INFO BlockManagerInfo - Added broadcast_252_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.3 MiB)
19:48:40.089 INFO SparkContext - Created broadcast 252 from broadcast at DAGScheduler.scala:1580
19:48:40.089 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 135 (MapPartitionsRDD[624] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:40.089 INFO TaskSchedulerImpl - Adding task set 135.0 with 1 tasks resource profile 0
19:48:40.089 INFO TaskSetManager - Starting task 0.0 in stage 135.0 (TID 191) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:40.089 INFO Executor - Running task 0.0 in stage 135.0 (TID 191)
19:48:40.094 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:40.094 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:40.105 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:40.105 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:40.105 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:40.105 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:40.105 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:40.105 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:40.123 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948391184707137580438982_0624_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest13443134896336220829.bam.parts/_temporary/0/task_202507151948391184707137580438982_0624_r_000000
19:48:40.123 INFO SparkHadoopMapRedUtil - attempt_202507151948391184707137580438982_0624_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:40.123 INFO Executor - Finished task 0.0 in stage 135.0 (TID 191). 1858 bytes result sent to driver
19:48:40.124 INFO TaskSetManager - Finished task 0.0 in stage 135.0 (TID 191) in 35 ms on localhost (executor driver) (1/1)
19:48:40.124 INFO TaskSchedulerImpl - Removed TaskSet 135.0, whose tasks have all completed, from pool
19:48:40.124 INFO DAGScheduler - ResultStage 135 (runJob at SparkHadoopWriter.scala:83) finished in 0.043 s
19:48:40.124 INFO DAGScheduler - Job 95 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:40.124 INFO TaskSchedulerImpl - Killing all running tasks in stage 135: Stage finished
19:48:40.124 INFO DAGScheduler - Job 95 finished: runJob at SparkHadoopWriter.scala:83, took 0.111832 s
19:48:40.124 INFO SparkHadoopWriter - Start to commit write Job job_202507151948391184707137580438982_0624.
19:48:40.130 INFO SparkHadoopWriter - Write Job job_202507151948391184707137580438982_0624 committed. Elapsed time: 5 ms.
19:48:40.143 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest13443134896336220829.bam
19:48:40.147 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest13443134896336220829.bam done
19:48:40.147 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest13443134896336220829.bam.parts/ to /tmp/ReadsSparkSinkUnitTest13443134896336220829.bam.sbi
19:48:40.152 INFO IndexFileMerger - Done merging .sbi files
19:48:40.154 INFO MemoryStore - Block broadcast_253 stored as values in memory (estimated size 320.0 B, free 1916.2 MiB)
19:48:40.154 INFO MemoryStore - Block broadcast_253_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.2 MiB)
19:48:40.154 INFO BlockManagerInfo - Added broadcast_253_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.3 MiB)
19:48:40.154 INFO SparkContext - Created broadcast 253 from broadcast at BamSource.java:104
19:48:40.156 INFO MemoryStore - Block broadcast_254 stored as values in memory (estimated size 297.9 KiB, free 1915.9 MiB)
19:48:40.162 INFO MemoryStore - Block broadcast_254_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.9 MiB)
19:48:40.163 INFO BlockManagerInfo - Added broadcast_254_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.2 MiB)
19:48:40.163 INFO SparkContext - Created broadcast 254 from newAPIHadoopFile at PathSplitSource.java:96
19:48:40.171 INFO FileInputFormat - Total input files to process : 1
19:48:40.185 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:40.186 INFO DAGScheduler - Got job 96 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:40.186 INFO DAGScheduler - Final stage: ResultStage 136 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:40.186 INFO DAGScheduler - Parents of final stage: List()
19:48:40.186 INFO DAGScheduler - Missing parents: List()
19:48:40.186 INFO DAGScheduler - Submitting ResultStage 136 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:40.192 INFO MemoryStore - Block broadcast_255 stored as values in memory (estimated size 148.2 KiB, free 1915.7 MiB)
19:48:40.192 INFO MemoryStore - Block broadcast_255_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.7 MiB)
19:48:40.192 INFO BlockManagerInfo - Added broadcast_255_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.2 MiB)
19:48:40.193 INFO SparkContext - Created broadcast 255 from broadcast at DAGScheduler.scala:1580
19:48:40.193 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 136 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:40.193 INFO TaskSchedulerImpl - Adding task set 136.0 with 1 tasks resource profile 0
19:48:40.193 INFO TaskSetManager - Starting task 0.0 in stage 136.0 (TID 192) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:40.194 INFO Executor - Running task 0.0 in stage 136.0 (TID 192)
19:48:40.207 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13443134896336220829.bam:0+237038
19:48:40.211 INFO Executor - Finished task 0.0 in stage 136.0 (TID 192). 651483 bytes result sent to driver
19:48:40.213 INFO TaskSetManager - Finished task 0.0 in stage 136.0 (TID 192) in 20 ms on localhost (executor driver) (1/1)
19:48:40.213 INFO TaskSchedulerImpl - Removed TaskSet 136.0, whose tasks have all completed, from pool
19:48:40.213 INFO DAGScheduler - ResultStage 136 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.027 s
19:48:40.213 INFO DAGScheduler - Job 96 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:40.213 INFO TaskSchedulerImpl - Killing all running tasks in stage 136: Stage finished
19:48:40.213 INFO DAGScheduler - Job 96 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.027849 s
19:48:40.223 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:40.223 INFO DAGScheduler - Got job 97 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:40.223 INFO DAGScheduler - Final stage: ResultStage 137 (count at ReadsSparkSinkUnitTest.java:185)
19:48:40.223 INFO DAGScheduler - Parents of final stage: List()
19:48:40.223 INFO DAGScheduler - Missing parents: List()
19:48:40.223 INFO DAGScheduler - Submitting ResultStage 137 (MapPartitionsRDD[612] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:40.239 INFO MemoryStore - Block broadcast_256 stored as values in memory (estimated size 426.1 KiB, free 1915.2 MiB)
19:48:40.241 INFO MemoryStore - Block broadcast_256_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.1 MiB)
19:48:40.241 INFO BlockManagerInfo - Added broadcast_256_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.0 MiB)
19:48:40.241 INFO SparkContext - Created broadcast 256 from broadcast at DAGScheduler.scala:1580
19:48:40.241 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 137 (MapPartitionsRDD[612] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:40.241 INFO TaskSchedulerImpl - Adding task set 137.0 with 1 tasks resource profile 0
19:48:40.242 INFO TaskSetManager - Starting task 0.0 in stage 137.0 (TID 193) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:40.242 INFO Executor - Running task 0.0 in stage 137.0 (TID 193)
19:48:40.276 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:40.286 INFO Executor - Finished task 0.0 in stage 137.0 (TID 193). 989 bytes result sent to driver
19:48:40.286 INFO TaskSetManager - Finished task 0.0 in stage 137.0 (TID 193) in 44 ms on localhost (executor driver) (1/1)
19:48:40.286 INFO TaskSchedulerImpl - Removed TaskSet 137.0, whose tasks have all completed, from pool
19:48:40.287 INFO DAGScheduler - ResultStage 137 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.064 s
19:48:40.287 INFO DAGScheduler - Job 97 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:40.287 INFO TaskSchedulerImpl - Killing all running tasks in stage 137: Stage finished
19:48:40.287 INFO DAGScheduler - Job 97 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.064195 s
19:48:40.290 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:40.290 INFO DAGScheduler - Got job 98 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:40.290 INFO DAGScheduler - Final stage: ResultStage 138 (count at ReadsSparkSinkUnitTest.java:185)
19:48:40.290 INFO DAGScheduler - Parents of final stage: List()
19:48:40.290 INFO DAGScheduler - Missing parents: List()
19:48:40.290 INFO DAGScheduler - Submitting ResultStage 138 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:40.297 INFO MemoryStore - Block broadcast_257 stored as values in memory (estimated size 148.1 KiB, free 1915.0 MiB)
19:48:40.297 INFO MemoryStore - Block broadcast_257_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1914.9 MiB)
19:48:40.297 INFO BlockManagerInfo - Added broadcast_257_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.0 MiB)
19:48:40.298 INFO SparkContext - Created broadcast 257 from broadcast at DAGScheduler.scala:1580
19:48:40.298 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 138 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:40.298 INFO TaskSchedulerImpl - Adding task set 138.0 with 1 tasks resource profile 0
19:48:40.298 INFO TaskSetManager - Starting task 0.0 in stage 138.0 (TID 194) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:40.299 INFO Executor - Running task 0.0 in stage 138.0 (TID 194)
19:48:40.310 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13443134896336220829.bam:0+237038
19:48:40.319 INFO Executor - Finished task 0.0 in stage 138.0 (TID 194). 1075 bytes result sent to driver
19:48:40.319 INFO BlockManagerInfo - Removed broadcast_256_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.1 MiB)
19:48:40.320 INFO TaskSetManager - Finished task 0.0 in stage 138.0 (TID 194) in 22 ms on localhost (executor driver) (1/1)
19:48:40.320 INFO TaskSchedulerImpl - Removed TaskSet 138.0, whose tasks have all completed, from pool
19:48:40.320 INFO DAGScheduler - ResultStage 138 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.029 s
19:48:40.320 INFO DAGScheduler - Job 98 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:40.320 INFO TaskSchedulerImpl - Killing all running tasks in stage 138: Stage finished
19:48:40.320 INFO DAGScheduler - Job 98 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.030148 s
19:48:40.321 INFO BlockManagerInfo - Removed broadcast_255_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.2 MiB)
19:48:40.322 INFO BlockManagerInfo - Removed broadcast_243_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:40.323 INFO BlockManagerInfo - Removed broadcast_248_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:40.323 INFO BlockManagerInfo - Removed broadcast_252_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.3 MiB)
19:48:40.324 INFO BlockManagerInfo - Removed broadcast_250_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:48:40.324 INFO MemoryStore - Block broadcast_258 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
19:48:40.325 INFO BlockManagerInfo - Removed broadcast_251_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.5 MiB)
19:48:40.325 INFO BlockManagerInfo - Removed broadcast_246_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.6 MiB)
19:48:40.326 INFO BlockManagerInfo - Removed broadcast_245_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.8 MiB)
19:48:40.326 INFO BlockManagerInfo - Removed broadcast_249_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.8 MiB)
19:48:40.327 INFO BlockManagerInfo - Removed broadcast_237_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:40.332 INFO MemoryStore - Block broadcast_258_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.8 MiB)
19:48:40.332 INFO BlockManagerInfo - Added broadcast_258_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.8 MiB)
19:48:40.332 INFO SparkContext - Created broadcast 258 from newAPIHadoopFile at PathSplitSource.java:96
19:48:40.354 INFO MemoryStore - Block broadcast_259 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
19:48:40.360 INFO MemoryStore - Block broadcast_259_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
19:48:40.360 INFO BlockManagerInfo - Added broadcast_259_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.8 MiB)
19:48:40.360 INFO SparkContext - Created broadcast 259 from newAPIHadoopFile at PathSplitSource.java:96
19:48:40.380 INFO FileInputFormat - Total input files to process : 1
19:48:40.381 INFO MemoryStore - Block broadcast_260 stored as values in memory (estimated size 160.7 KiB, free 1918.3 MiB)
19:48:40.382 INFO MemoryStore - Block broadcast_260_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.3 MiB)
19:48:40.382 INFO BlockManagerInfo - Added broadcast_260_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.7 MiB)
19:48:40.382 INFO SparkContext - Created broadcast 260 from broadcast at ReadsSparkSink.java:133
19:48:40.383 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:40.383 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:40.384 INFO MemoryStore - Block broadcast_261 stored as values in memory (estimated size 163.2 KiB, free 1918.1 MiB)
19:48:40.384 INFO MemoryStore - Block broadcast_261_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.1 MiB)
19:48:40.384 INFO BlockManagerInfo - Added broadcast_261_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.7 MiB)
19:48:40.384 INFO SparkContext - Created broadcast 261 from broadcast at BamSink.java:76
19:48:40.386 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:40.386 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:40.386 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:40.407 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:40.407 INFO DAGScheduler - Registering RDD 644 (mapToPair at SparkUtils.java:161) as input to shuffle 29
19:48:40.408 INFO DAGScheduler - Got job 99 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:40.408 INFO DAGScheduler - Final stage: ResultStage 140 (runJob at SparkHadoopWriter.scala:83)
19:48:40.408 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 139)
19:48:40.408 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 139)
19:48:40.408 INFO DAGScheduler - Submitting ShuffleMapStage 139 (MapPartitionsRDD[644] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:40.426 INFO MemoryStore - Block broadcast_262 stored as values in memory (estimated size 520.4 KiB, free 1917.6 MiB)
19:48:40.427 INFO MemoryStore - Block broadcast_262_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.4 MiB)
19:48:40.427 INFO BlockManagerInfo - Added broadcast_262_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.6 MiB)
19:48:40.427 INFO SparkContext - Created broadcast 262 from broadcast at DAGScheduler.scala:1580
19:48:40.428 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 139 (MapPartitionsRDD[644] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:40.428 INFO TaskSchedulerImpl - Adding task set 139.0 with 1 tasks resource profile 0
19:48:40.428 INFO TaskSetManager - Starting task 0.0 in stage 139.0 (TID 195) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:40.428 INFO Executor - Running task 0.0 in stage 139.0 (TID 195)
19:48:40.459 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:40.474 INFO Executor - Finished task 0.0 in stage 139.0 (TID 195). 1148 bytes result sent to driver
19:48:40.475 INFO TaskSetManager - Finished task 0.0 in stage 139.0 (TID 195) in 47 ms on localhost (executor driver) (1/1)
19:48:40.475 INFO TaskSchedulerImpl - Removed TaskSet 139.0, whose tasks have all completed, from pool
19:48:40.475 INFO DAGScheduler - ShuffleMapStage 139 (mapToPair at SparkUtils.java:161) finished in 0.067 s
19:48:40.475 INFO DAGScheduler - looking for newly runnable stages
19:48:40.475 INFO DAGScheduler - running: HashSet()
19:48:40.475 INFO DAGScheduler - waiting: HashSet(ResultStage 140)
19:48:40.475 INFO DAGScheduler - failed: HashSet()
19:48:40.475 INFO DAGScheduler - Submitting ResultStage 140 (MapPartitionsRDD[649] at mapToPair at BamSink.java:91), which has no missing parents
19:48:40.486 INFO MemoryStore - Block broadcast_263 stored as values in memory (estimated size 241.4 KiB, free 1917.2 MiB)
19:48:40.487 INFO MemoryStore - Block broadcast_263_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1917.1 MiB)
19:48:40.487 INFO BlockManagerInfo - Added broadcast_263_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.5 MiB)
19:48:40.488 INFO SparkContext - Created broadcast 263 from broadcast at DAGScheduler.scala:1580
19:48:40.488 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 140 (MapPartitionsRDD[649] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:40.488 INFO TaskSchedulerImpl - Adding task set 140.0 with 1 tasks resource profile 0
19:48:40.488 INFO TaskSetManager - Starting task 0.0 in stage 140.0 (TID 196) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:40.489 INFO Executor - Running task 0.0 in stage 140.0 (TID 196)
19:48:40.493 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:40.493 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:40.504 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:40.504 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:40.504 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:40.505 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:40.505 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:40.505 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:40.520 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948406132058786638802290_0649_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest11937960889066033099.bam.parts/_temporary/0/task_202507151948406132058786638802290_0649_r_000000
19:48:40.520 INFO SparkHadoopMapRedUtil - attempt_202507151948406132058786638802290_0649_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:40.521 INFO Executor - Finished task 0.0 in stage 140.0 (TID 196). 1858 bytes result sent to driver
19:48:40.521 INFO TaskSetManager - Finished task 0.0 in stage 140.0 (TID 196) in 33 ms on localhost (executor driver) (1/1)
19:48:40.521 INFO TaskSchedulerImpl - Removed TaskSet 140.0, whose tasks have all completed, from pool
19:48:40.522 INFO DAGScheduler - ResultStage 140 (runJob at SparkHadoopWriter.scala:83) finished in 0.046 s
19:48:40.522 INFO DAGScheduler - Job 99 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:40.522 INFO TaskSchedulerImpl - Killing all running tasks in stage 140: Stage finished
19:48:40.522 INFO DAGScheduler - Job 99 finished: runJob at SparkHadoopWriter.scala:83, took 0.114867 s
19:48:40.522 INFO SparkHadoopWriter - Start to commit write Job job_202507151948406132058786638802290_0649.
19:48:40.527 INFO SparkHadoopWriter - Write Job job_202507151948406132058786638802290_0649 committed. Elapsed time: 4 ms.
19:48:40.538 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest11937960889066033099.bam
19:48:40.543 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest11937960889066033099.bam done
19:48:40.545 INFO MemoryStore - Block broadcast_264 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
19:48:40.551 INFO MemoryStore - Block broadcast_264_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
19:48:40.551 INFO BlockManagerInfo - Added broadcast_264_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:40.551 INFO SparkContext - Created broadcast 264 from newAPIHadoopFile at PathSplitSource.java:96
19:48:40.571 INFO FileInputFormat - Total input files to process : 1
19:48:40.605 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:40.606 INFO DAGScheduler - Got job 100 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:40.606 INFO DAGScheduler - Final stage: ResultStage 141 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:40.606 INFO DAGScheduler - Parents of final stage: List()
19:48:40.606 INFO DAGScheduler - Missing parents: List()
19:48:40.606 INFO DAGScheduler - Submitting ResultStage 141 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:40.623 INFO MemoryStore - Block broadcast_265 stored as values in memory (estimated size 426.2 KiB, free 1916.4 MiB)
19:48:40.624 INFO MemoryStore - Block broadcast_265_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.2 MiB)
19:48:40.624 INFO BlockManagerInfo - Added broadcast_265_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.3 MiB)
19:48:40.624 INFO SparkContext - Created broadcast 265 from broadcast at DAGScheduler.scala:1580
19:48:40.624 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 141 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:40.624 INFO TaskSchedulerImpl - Adding task set 141.0 with 1 tasks resource profile 0
19:48:40.625 INFO TaskSetManager - Starting task 0.0 in stage 141.0 (TID 197) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:40.625 INFO Executor - Running task 0.0 in stage 141.0 (TID 197)
19:48:40.655 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest11937960889066033099.bam:0+237038
19:48:40.667 INFO Executor - Finished task 0.0 in stage 141.0 (TID 197). 651483 bytes result sent to driver
19:48:40.668 INFO TaskSetManager - Finished task 0.0 in stage 141.0 (TID 197) in 43 ms on localhost (executor driver) (1/1)
19:48:40.668 INFO TaskSchedulerImpl - Removed TaskSet 141.0, whose tasks have all completed, from pool
19:48:40.668 INFO DAGScheduler - ResultStage 141 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.062 s
19:48:40.668 INFO DAGScheduler - Job 100 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:40.668 INFO TaskSchedulerImpl - Killing all running tasks in stage 141: Stage finished
19:48:40.668 INFO DAGScheduler - Job 100 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.062830 s
19:48:40.678 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:40.678 INFO DAGScheduler - Got job 101 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:40.678 INFO DAGScheduler - Final stage: ResultStage 142 (count at ReadsSparkSinkUnitTest.java:185)
19:48:40.678 INFO DAGScheduler - Parents of final stage: List()
19:48:40.678 INFO DAGScheduler - Missing parents: List()
19:48:40.678 INFO DAGScheduler - Submitting ResultStage 142 (MapPartitionsRDD[637] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:40.694 INFO MemoryStore - Block broadcast_266 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
19:48:40.696 INFO MemoryStore - Block broadcast_266_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
19:48:40.696 INFO BlockManagerInfo - Added broadcast_266_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.2 MiB)
19:48:40.696 INFO SparkContext - Created broadcast 266 from broadcast at DAGScheduler.scala:1580
19:48:40.696 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 142 (MapPartitionsRDD[637] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:40.696 INFO TaskSchedulerImpl - Adding task set 142.0 with 1 tasks resource profile 0
19:48:40.697 INFO TaskSetManager - Starting task 0.0 in stage 142.0 (TID 198) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:40.697 INFO Executor - Running task 0.0 in stage 142.0 (TID 198)
19:48:40.726 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:40.735 INFO Executor - Finished task 0.0 in stage 142.0 (TID 198). 989 bytes result sent to driver
19:48:40.736 INFO TaskSetManager - Finished task 0.0 in stage 142.0 (TID 198) in 40 ms on localhost (executor driver) (1/1)
19:48:40.736 INFO TaskSchedulerImpl - Removed TaskSet 142.0, whose tasks have all completed, from pool
19:48:40.736 INFO DAGScheduler - ResultStage 142 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
19:48:40.736 INFO DAGScheduler - Job 101 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:40.736 INFO TaskSchedulerImpl - Killing all running tasks in stage 142: Stage finished
19:48:40.736 INFO DAGScheduler - Job 101 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058584 s
19:48:40.740 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:40.740 INFO DAGScheduler - Got job 102 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:40.740 INFO DAGScheduler - Final stage: ResultStage 143 (count at ReadsSparkSinkUnitTest.java:185)
19:48:40.740 INFO DAGScheduler - Parents of final stage: List()
19:48:40.740 INFO DAGScheduler - Missing parents: List()
19:48:40.740 INFO DAGScheduler - Submitting ResultStage 143 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:40.756 INFO MemoryStore - Block broadcast_267 stored as values in memory (estimated size 426.1 KiB, free 1915.2 MiB)
19:48:40.758 INFO MemoryStore - Block broadcast_267_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.1 MiB)
19:48:40.758 INFO BlockManagerInfo - Added broadcast_267_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.0 MiB)
19:48:40.758 INFO SparkContext - Created broadcast 267 from broadcast at DAGScheduler.scala:1580
19:48:40.758 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 143 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:40.758 INFO TaskSchedulerImpl - Adding task set 143.0 with 1 tasks resource profile 0
19:48:40.759 INFO TaskSetManager - Starting task 0.0 in stage 143.0 (TID 199) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:40.759 INFO Executor - Running task 0.0 in stage 143.0 (TID 199)
19:48:40.789 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest11937960889066033099.bam:0+237038
19:48:40.801 INFO Executor - Finished task 0.0 in stage 143.0 (TID 199). 989 bytes result sent to driver
19:48:40.802 INFO TaskSetManager - Finished task 0.0 in stage 143.0 (TID 199) in 43 ms on localhost (executor driver) (1/1)
19:48:40.802 INFO TaskSchedulerImpl - Removed TaskSet 143.0, whose tasks have all completed, from pool
19:48:40.802 INFO DAGScheduler - ResultStage 143 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.062 s
19:48:40.802 INFO DAGScheduler - Job 102 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:40.802 INFO TaskSchedulerImpl - Killing all running tasks in stage 143: Stage finished
19:48:40.802 INFO DAGScheduler - Job 102 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.062461 s
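
Jobs 100-102 above form one read-back verification pass: a collect at ReadsSparkSinkUnitTest.java:182 pulls the re-read records to the driver, and the two counts at line 185 compare record totals between the original input BAM and the freshly written copy under /tmp. The following is a minimal, hypothetical sketch of that collect/count pattern in Spark's Java API; it uses placeholder String records instead of the actual GATK read types and is not the test's real code.

import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

// Hedged sketch only: mirrors the collect-then-count shape seen as jobs 100-102.
public final class ReadBackCheckSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local[1]").setAppName("read-back-check");
        try (JavaSparkContext ctx = new JavaSparkContext(conf)) {
            // Placeholder data; in the real test these RDDs come from ReadsSparkSource.
            JavaRDD<String> original = ctx.parallelize(List.of("r1", "r2", "r3"));
            JavaRDD<String> written  = ctx.parallelize(List.of("r1", "r2", "r3"));

            List<String> collected = written.collect();   // analogous to the collect at line 182
            long originalCount = original.count();        // analogous to the counts at line 185
            long writtenCount  = written.count();
            if (collected.size() != originalCount || writtenCount != originalCount) {
                throw new AssertionError("round-trip mismatch");
            }
        }
    }
}

Each collect() and count() call launches its own Spark job, which is why the log shows three separate jobs (and three broadcast variables for the serialized task closures) per verification pass.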
19:48:40.804 INFO MemoryStore - Block broadcast_268 stored as values in memory (estimated size 298.0 KiB, free 1914.8 MiB)
19:48:40.811 INFO MemoryStore - Block broadcast_268_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1914.8 MiB)
19:48:40.811 INFO BlockManagerInfo - Added broadcast_268_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.0 MiB)
19:48:40.811 INFO SparkContext - Created broadcast 268 from newAPIHadoopFile at PathSplitSource.java:96
19:48:40.832 INFO MemoryStore - Block broadcast_269 stored as values in memory (estimated size 298.0 KiB, free 1914.5 MiB)
19:48:40.838 INFO MemoryStore - Block broadcast_269_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1914.4 MiB)
19:48:40.838 INFO BlockManagerInfo - Added broadcast_269_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1918.9 MiB)
19:48:40.838 INFO SparkContext - Created broadcast 269 from newAPIHadoopFile at PathSplitSource.java:96
19:48:40.858 INFO FileInputFormat - Total input files to process : 1
19:48:40.860 INFO MemoryStore - Block broadcast_270 stored as values in memory (estimated size 160.7 KiB, free 1914.3 MiB)
19:48:40.861 INFO MemoryStore - Block broadcast_270_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1914.3 MiB)
19:48:40.861 INFO BlockManagerInfo - Added broadcast_270_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1918.9 MiB)
19:48:40.861 INFO SparkContext - Created broadcast 270 from broadcast at ReadsSparkSink.java:133
19:48:40.862 INFO MemoryStore - Block broadcast_271 stored as values in memory (estimated size 163.2 KiB, free 1914.1 MiB)
19:48:40.869 INFO MemoryStore - Block broadcast_271_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1914.6 MiB)
19:48:40.869 INFO BlockManagerInfo - Removed broadcast_267_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.0 MiB)
19:48:40.869 INFO BlockManagerInfo - Added broadcast_271_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.0 MiB)
19:48:40.870 INFO SparkContext - Created broadcast 271 from broadcast at BamSink.java:76
19:48:40.870 INFO BlockManagerInfo - Removed broadcast_253_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.0 MiB)
19:48:40.871 INFO BlockManagerInfo - Removed broadcast_265_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.2 MiB)
19:48:40.871 INFO BlockManagerInfo - Removed broadcast_259_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:40.872 INFO BlockManagerInfo - Removed broadcast_262_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.4 MiB)
19:48:40.872 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:40.872 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:40.872 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:40.873 INFO BlockManagerInfo - Removed broadcast_269_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.4 MiB)
19:48:40.873 INFO BlockManagerInfo - Removed broadcast_247_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:40.874 INFO BlockManagerInfo - Removed broadcast_258_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:40.874 INFO BlockManagerInfo - Removed broadcast_264_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:40.875 INFO BlockManagerInfo - Removed broadcast_266_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.7 MiB)
19:48:40.875 INFO BlockManagerInfo - Removed broadcast_254_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:40.876 INFO BlockManagerInfo - Removed broadcast_263_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.9 MiB)
19:48:40.877 INFO BlockManagerInfo - Removed broadcast_260_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:40.877 INFO BlockManagerInfo - Removed broadcast_261_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:40.878 INFO BlockManagerInfo - Removed broadcast_257_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.9 MiB)
19:48:40.895 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:40.895 INFO DAGScheduler - Registering RDD 670 (mapToPair at SparkUtils.java:161) as input to shuffle 30
19:48:40.895 INFO DAGScheduler - Got job 103 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:40.895 INFO DAGScheduler - Final stage: ResultStage 145 (runJob at SparkHadoopWriter.scala:83)
19:48:40.895 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 144)
19:48:40.895 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 144)
19:48:40.896 INFO DAGScheduler - Submitting ShuffleMapStage 144 (MapPartitionsRDD[670] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:40.913 INFO MemoryStore - Block broadcast_272 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
19:48:40.914 INFO MemoryStore - Block broadcast_272_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
19:48:40.914 INFO BlockManagerInfo - Added broadcast_272_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.8 MiB)
19:48:40.914 INFO SparkContext - Created broadcast 272 from broadcast at DAGScheduler.scala:1580
19:48:40.915 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 144 (MapPartitionsRDD[670] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:40.915 INFO TaskSchedulerImpl - Adding task set 144.0 with 1 tasks resource profile 0
19:48:40.915 INFO TaskSetManager - Starting task 0.0 in stage 144.0 (TID 200) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
19:48:40.915 INFO Executor - Running task 0.0 in stage 144.0 (TID 200)
19:48:40.946 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:40.964 INFO Executor - Finished task 0.0 in stage 144.0 (TID 200). 1148 bytes result sent to driver
19:48:40.964 INFO TaskSetManager - Finished task 0.0 in stage 144.0 (TID 200) in 49 ms on localhost (executor driver) (1/1)
19:48:40.964 INFO TaskSchedulerImpl - Removed TaskSet 144.0, whose tasks have all completed, from pool
19:48:40.964 INFO DAGScheduler - ShuffleMapStage 144 (mapToPair at SparkUtils.java:161) finished in 0.068 s
19:48:40.964 INFO DAGScheduler - looking for newly runnable stages
19:48:40.964 INFO DAGScheduler - running: HashSet()
19:48:40.964 INFO DAGScheduler - waiting: HashSet(ResultStage 145)
19:48:40.964 INFO DAGScheduler - failed: HashSet()
19:48:40.965 INFO DAGScheduler - Submitting ResultStage 145 (MapPartitionsRDD[675] at mapToPair at BamSink.java:91), which has no missing parents
19:48:40.976 INFO MemoryStore - Block broadcast_273 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
19:48:40.977 INFO MemoryStore - Block broadcast_273_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
19:48:40.977 INFO BlockManagerInfo - Added broadcast_273_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.7 MiB)
19:48:40.977 INFO SparkContext - Created broadcast 273 from broadcast at DAGScheduler.scala:1580
19:48:40.977 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 145 (MapPartitionsRDD[675] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:40.977 INFO TaskSchedulerImpl - Adding task set 145.0 with 1 tasks resource profile 0
19:48:40.978 INFO TaskSetManager - Starting task 0.0 in stage 145.0 (TID 201) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:40.978 INFO Executor - Running task 0.0 in stage 145.0 (TID 201)
19:48:40.982 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:40.982 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:40.993 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:40.993 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:40.993 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:40.993 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:40.993 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:40.993 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:41.017 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948402759125521393246862_0675_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest21872704602311069131.bam.parts/_temporary/0/task_202507151948402759125521393246862_0675_r_000000
19:48:41.017 INFO SparkHadoopMapRedUtil - attempt_202507151948402759125521393246862_0675_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:41.018 INFO Executor - Finished task 0.0 in stage 145.0 (TID 201). 1858 bytes result sent to driver
19:48:41.018 INFO TaskSetManager - Finished task 0.0 in stage 145.0 (TID 201) in 40 ms on localhost (executor driver) (1/1)
19:48:41.018 INFO TaskSchedulerImpl - Removed TaskSet 145.0, whose tasks have all completed, from pool
19:48:41.018 INFO DAGScheduler - ResultStage 145 (runJob at SparkHadoopWriter.scala:83) finished in 0.053 s
19:48:41.018 INFO DAGScheduler - Job 103 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.018 INFO TaskSchedulerImpl - Killing all running tasks in stage 145: Stage finished
19:48:41.019 INFO DAGScheduler - Job 103 finished: runJob at SparkHadoopWriter.scala:83, took 0.123805 s
19:48:41.019 INFO SparkHadoopWriter - Start to commit write Job job_202507151948402759125521393246862_0675.
19:48:41.025 INFO SparkHadoopWriter - Write Job job_202507151948402759125521393246862_0675 committed. Elapsed time: 5 ms.
19:48:41.036 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest21872704602311069131.bam
19:48:41.041 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest21872704602311069131.bam done
19:48:41.041 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest21872704602311069131.bam.parts/ to /tmp/ReadsSparkSinkUnitTest21872704602311069131.bam.sbi
19:48:41.045 INFO IndexFileMerger - Done merging .sbi files
19:48:41.045 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest21872704602311069131.bam.parts/ to /tmp/ReadsSparkSinkUnitTest21872704602311069131.bam.bai
19:48:41.051 INFO IndexFileMerger - Done merging .bai files
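
The block ending here is one complete write cycle: a shuffle stage (mapToPair at SparkUtils.java:161) partitions the reads, a result stage writes them through FileOutputCommitter into a .bam.parts/_temporary directory, SparkHadoopWriter commits the job, and then the part files are concatenated into a single BAM before the per-part .sbi and .bai indexes are merged. Below is a hedged illustration of the "concatenate parts" idea only; it is not the Disq/GATK implementation (real BAM merging must also handle the header and BGZF terminator blocks, which are omitted here), and the method and class names are hypothetical.

import java.io.IOException;
import java.util.Arrays;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Hedged sketch: stream each part file, in name order, into one destination file,
// then remove the temporary .parts directory.
public final class ConcatPartsSketch {
    public static void concatParts(Path partsDir, Path target, Configuration conf) throws IOException {
        FileSystem fs = FileSystem.get(conf);
        try (FSDataOutputStream out = fs.create(target, true)) {
            FileStatus[] parts = fs.listStatus(partsDir, p -> p.getName().startsWith("part-"));
            Arrays.sort(parts);                              // FileStatus orders by path name
            for (FileStatus part : parts) {
                try (FSDataInputStream in = fs.open(part.getPath())) {
                    IOUtils.copyBytes(in, out, conf, false); // append this part's bytes
                }
            }
        }
        fs.delete(partsDir, true);                           // clean up the temporary parts
    }
}

The index merge reported by IndexFileMerger follows the same pattern for the .sbi and .bai side files produced alongside each part.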
19:48:41.053 INFO MemoryStore - Block broadcast_274 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
19:48:41.053 INFO MemoryStore - Block broadcast_274_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
19:48:41.054 INFO BlockManagerInfo - Added broadcast_274_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.7 MiB)
19:48:41.054 INFO SparkContext - Created broadcast 274 from broadcast at BamSource.java:104
19:48:41.055 INFO MemoryStore - Block broadcast_275 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:41.063 INFO MemoryStore - Block broadcast_275_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:41.064 INFO BlockManagerInfo - Added broadcast_275_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:41.064 INFO SparkContext - Created broadcast 275 from newAPIHadoopFile at PathSplitSource.java:96
19:48:41.078 INFO FileInputFormat - Total input files to process : 1
19:48:41.094 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:41.095 INFO DAGScheduler - Got job 104 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:41.095 INFO DAGScheduler - Final stage: ResultStage 146 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:41.095 INFO DAGScheduler - Parents of final stage: List()
19:48:41.095 INFO DAGScheduler - Missing parents: List()
19:48:41.095 INFO DAGScheduler - Submitting ResultStage 146 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.102 INFO MemoryStore - Block broadcast_276 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
19:48:41.103 INFO MemoryStore - Block broadcast_276_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
19:48:41.103 INFO BlockManagerInfo - Added broadcast_276_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:41.103 INFO SparkContext - Created broadcast 276 from broadcast at DAGScheduler.scala:1580
19:48:41.103 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 146 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.103 INFO TaskSchedulerImpl - Adding task set 146.0 with 1 tasks resource profile 0
19:48:41.104 INFO TaskSetManager - Starting task 0.0 in stage 146.0 (TID 202) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:41.104 INFO Executor - Running task 0.0 in stage 146.0 (TID 202)
19:48:41.115 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest21872704602311069131.bam:0+235514
19:48:41.119 INFO Executor - Finished task 0.0 in stage 146.0 (TID 202). 650141 bytes result sent to driver
19:48:41.121 INFO TaskSetManager - Finished task 0.0 in stage 146.0 (TID 202) in 17 ms on localhost (executor driver) (1/1)
19:48:41.121 INFO TaskSchedulerImpl - Removed TaskSet 146.0, whose tasks have all completed, from pool
19:48:41.121 INFO DAGScheduler - ResultStage 146 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.026 s
19:48:41.121 INFO DAGScheduler - Job 104 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.121 INFO TaskSchedulerImpl - Killing all running tasks in stage 146: Stage finished
19:48:41.121 INFO DAGScheduler - Job 104 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026572 s
19:48:41.130 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:41.130 INFO DAGScheduler - Got job 105 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:41.130 INFO DAGScheduler - Final stage: ResultStage 147 (count at ReadsSparkSinkUnitTest.java:185)
19:48:41.130 INFO DAGScheduler - Parents of final stage: List()
19:48:41.130 INFO DAGScheduler - Missing parents: List()
19:48:41.131 INFO DAGScheduler - Submitting ResultStage 147 (MapPartitionsRDD[663] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.147 INFO MemoryStore - Block broadcast_277 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:48:41.148 INFO MemoryStore - Block broadcast_277_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
19:48:41.149 INFO BlockManagerInfo - Added broadcast_277_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:41.149 INFO SparkContext - Created broadcast 277 from broadcast at DAGScheduler.scala:1580
19:48:41.149 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 147 (MapPartitionsRDD[663] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.149 INFO TaskSchedulerImpl - Adding task set 147.0 with 1 tasks resource profile 0
19:48:41.149 INFO TaskSetManager - Starting task 0.0 in stage 147.0 (TID 203) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
19:48:41.150 INFO Executor - Running task 0.0 in stage 147.0 (TID 203)
19:48:41.178 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:41.191 INFO Executor - Finished task 0.0 in stage 147.0 (TID 203). 989 bytes result sent to driver
19:48:41.192 INFO TaskSetManager - Finished task 0.0 in stage 147.0 (TID 203) in 43 ms on localhost (executor driver) (1/1)
19:48:41.192 INFO TaskSchedulerImpl - Removed TaskSet 147.0, whose tasks have all completed, from pool
19:48:41.192 INFO DAGScheduler - ResultStage 147 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.061 s
19:48:41.192 INFO DAGScheduler - Job 105 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.192 INFO TaskSchedulerImpl - Killing all running tasks in stage 147: Stage finished
19:48:41.192 INFO DAGScheduler - Job 105 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.062033 s
19:48:41.195 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:41.196 INFO DAGScheduler - Got job 106 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:41.196 INFO DAGScheduler - Final stage: ResultStage 148 (count at ReadsSparkSinkUnitTest.java:185)
19:48:41.196 INFO DAGScheduler - Parents of final stage: List()
19:48:41.196 INFO DAGScheduler - Missing parents: List()
19:48:41.196 INFO DAGScheduler - Submitting ResultStage 148 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.202 INFO MemoryStore - Block broadcast_278 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
19:48:41.203 INFO MemoryStore - Block broadcast_278_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
19:48:41.203 INFO BlockManagerInfo - Added broadcast_278_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.4 MiB)
19:48:41.203 INFO SparkContext - Created broadcast 278 from broadcast at DAGScheduler.scala:1580
19:48:41.203 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 148 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.203 INFO TaskSchedulerImpl - Adding task set 148.0 with 1 tasks resource profile 0
19:48:41.204 INFO TaskSetManager - Starting task 0.0 in stage 148.0 (TID 204) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:41.204 INFO Executor - Running task 0.0 in stage 148.0 (TID 204)
19:48:41.215 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest21872704602311069131.bam:0+235514
19:48:41.218 INFO Executor - Finished task 0.0 in stage 148.0 (TID 204). 989 bytes result sent to driver
19:48:41.218 INFO TaskSetManager - Finished task 0.0 in stage 148.0 (TID 204) in 14 ms on localhost (executor driver) (1/1)
19:48:41.218 INFO TaskSchedulerImpl - Removed TaskSet 148.0, whose tasks have all completed, from pool
19:48:41.218 INFO DAGScheduler - ResultStage 148 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
19:48:41.218 INFO DAGScheduler - Job 106 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.218 INFO TaskSchedulerImpl - Killing all running tasks in stage 148: Stage finished
19:48:41.218 INFO DAGScheduler - Job 106 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.023027 s
19:48:41.221 INFO MemoryStore - Block broadcast_279 stored as values in memory (estimated size 298.0 KiB, free 1916.8 MiB)
19:48:41.227 INFO MemoryStore - Block broadcast_279_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
19:48:41.227 INFO BlockManagerInfo - Added broadcast_279_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:41.227 INFO SparkContext - Created broadcast 279 from newAPIHadoopFile at PathSplitSource.java:96
19:48:41.248 INFO MemoryStore - Block broadcast_280 stored as values in memory (estimated size 298.0 KiB, free 1916.4 MiB)
19:48:41.253 INFO MemoryStore - Block broadcast_280_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
19:48:41.254 INFO BlockManagerInfo - Added broadcast_280_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:41.254 INFO SparkContext - Created broadcast 280 from newAPIHadoopFile at PathSplitSource.java:96
19:48:41.273 INFO FileInputFormat - Total input files to process : 1
19:48:41.274 INFO MemoryStore - Block broadcast_281 stored as values in memory (estimated size 19.6 KiB, free 1916.4 MiB)
19:48:41.274 INFO MemoryStore - Block broadcast_281_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
19:48:41.275 INFO BlockManagerInfo - Added broadcast_281_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.3 MiB)
19:48:41.275 INFO SparkContext - Created broadcast 281 from broadcast at ReadsSparkSink.java:133
19:48:41.275 INFO MemoryStore - Block broadcast_282 stored as values in memory (estimated size 20.0 KiB, free 1916.3 MiB)
19:48:41.276 INFO MemoryStore - Block broadcast_282_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
19:48:41.276 INFO BlockManagerInfo - Added broadcast_282_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.3 MiB)
19:48:41.276 INFO SparkContext - Created broadcast 282 from broadcast at BamSink.java:76
19:48:41.277 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:41.278 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:41.278 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:41.294 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:41.294 INFO DAGScheduler - Registering RDD 695 (mapToPair at SparkUtils.java:161) as input to shuffle 31
19:48:41.294 INFO DAGScheduler - Got job 107 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:41.294 INFO DAGScheduler - Final stage: ResultStage 150 (runJob at SparkHadoopWriter.scala:83)
19:48:41.294 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 149)
19:48:41.294 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 149)
19:48:41.295 INFO DAGScheduler - Submitting ShuffleMapStage 149 (MapPartitionsRDD[695] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:41.312 INFO MemoryStore - Block broadcast_283 stored as values in memory (estimated size 434.3 KiB, free 1915.9 MiB)
19:48:41.313 INFO MemoryStore - Block broadcast_283_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1915.8 MiB)
19:48:41.313 INFO BlockManagerInfo - Added broadcast_283_piece0 in memory on localhost:36125 (size: 157.6 KiB, free: 1919.1 MiB)
19:48:41.314 INFO SparkContext - Created broadcast 283 from broadcast at DAGScheduler.scala:1580
19:48:41.314 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 149 (MapPartitionsRDD[695] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:41.314 INFO TaskSchedulerImpl - Adding task set 149.0 with 1 tasks resource profile 0
19:48:41.314 INFO TaskSetManager - Starting task 0.0 in stage 149.0 (TID 205) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
19:48:41.315 INFO Executor - Running task 0.0 in stage 149.0 (TID 205)
19:48:41.344 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:41.361 INFO Executor - Finished task 0.0 in stage 149.0 (TID 205). 1234 bytes result sent to driver
19:48:41.362 INFO BlockManagerInfo - Removed broadcast_280_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:41.362 INFO TaskSetManager - Finished task 0.0 in stage 149.0 (TID 205) in 48 ms on localhost (executor driver) (1/1)
19:48:41.362 INFO TaskSchedulerImpl - Removed TaskSet 149.0, whose tasks have all completed, from pool
19:48:41.362 INFO BlockManagerInfo - Removed broadcast_276_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.2 MiB)
19:48:41.362 INFO DAGScheduler - ShuffleMapStage 149 (mapToPair at SparkUtils.java:161) finished in 0.067 s
19:48:41.362 INFO DAGScheduler - looking for newly runnable stages
19:48:41.362 INFO DAGScheduler - running: HashSet()
19:48:41.362 INFO DAGScheduler - waiting: HashSet(ResultStage 150)
19:48:41.362 INFO DAGScheduler - failed: HashSet()
19:48:41.363 INFO DAGScheduler - Submitting ResultStage 150 (MapPartitionsRDD[700] at mapToPair at BamSink.java:91), which has no missing parents
19:48:41.363 INFO BlockManagerInfo - Removed broadcast_268_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.3 MiB)
19:48:41.364 INFO BlockManagerInfo - Removed broadcast_273_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.4 MiB)
19:48:41.364 INFO BlockManagerInfo - Removed broadcast_271_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:41.365 INFO BlockManagerInfo - Removed broadcast_277_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:41.365 INFO BlockManagerInfo - Removed broadcast_278_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.6 MiB)
19:48:41.366 INFO BlockManagerInfo - Removed broadcast_275_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:41.366 INFO BlockManagerInfo - Removed broadcast_270_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:41.367 INFO BlockManagerInfo - Removed broadcast_272_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.8 MiB)
19:48:41.368 INFO BlockManagerInfo - Removed broadcast_274_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.8 MiB)
19:48:41.371 INFO MemoryStore - Block broadcast_284 stored as values in memory (estimated size 155.3 KiB, free 1918.9 MiB)
19:48:41.372 INFO MemoryStore - Block broadcast_284_piece0 stored as bytes in memory (estimated size 58.4 KiB, free 1918.8 MiB)
19:48:41.372 INFO BlockManagerInfo - Added broadcast_284_piece0 in memory on localhost:36125 (size: 58.4 KiB, free: 1919.7 MiB)
19:48:41.372 INFO SparkContext - Created broadcast 284 from broadcast at DAGScheduler.scala:1580
19:48:41.373 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 150 (MapPartitionsRDD[700] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:41.373 INFO TaskSchedulerImpl - Adding task set 150.0 with 1 tasks resource profile 0
19:48:41.373 INFO TaskSetManager - Starting task 0.0 in stage 150.0 (TID 206) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:41.373 INFO Executor - Running task 0.0 in stage 150.0 (TID 206)
19:48:41.377 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:41.377 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:41.388 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:41.388 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:41.388 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:41.388 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:41.388 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:41.388 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:41.410 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948415483147384916048611_0700_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest37590291106431008143.bam.parts/_temporary/0/task_202507151948415483147384916048611_0700_r_000000
19:48:41.410 INFO SparkHadoopMapRedUtil - attempt_202507151948415483147384916048611_0700_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:41.411 INFO Executor - Finished task 0.0 in stage 150.0 (TID 206). 1858 bytes result sent to driver
19:48:41.411 INFO TaskSetManager - Finished task 0.0 in stage 150.0 (TID 206) in 38 ms on localhost (executor driver) (1/1)
19:48:41.411 INFO TaskSchedulerImpl - Removed TaskSet 150.0, whose tasks have all completed, from pool
19:48:41.411 INFO DAGScheduler - ResultStage 150 (runJob at SparkHadoopWriter.scala:83) finished in 0.048 s
19:48:41.411 INFO DAGScheduler - Job 107 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.411 INFO TaskSchedulerImpl - Killing all running tasks in stage 150: Stage finished
19:48:41.411 INFO DAGScheduler - Job 107 finished: runJob at SparkHadoopWriter.scala:83, took 0.117741 s
19:48:41.412 INFO SparkHadoopWriter - Start to commit write Job job_202507151948415483147384916048611_0700.
19:48:41.416 INFO SparkHadoopWriter - Write Job job_202507151948415483147384916048611_0700 committed. Elapsed time: 4 ms.
19:48:41.427 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest37590291106431008143.bam
19:48:41.431 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest37590291106431008143.bam done
19:48:41.431 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest37590291106431008143.bam.parts/ to /tmp/ReadsSparkSinkUnitTest37590291106431008143.bam.sbi
19:48:41.436 INFO IndexFileMerger - Done merging .sbi files
19:48:41.436 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest37590291106431008143.bam.parts/ to /tmp/ReadsSparkSinkUnitTest37590291106431008143.bam.bai
19:48:41.441 INFO IndexFileMerger - Done merging .bai files
19:48:41.442 INFO MemoryStore - Block broadcast_285 stored as values in memory (estimated size 312.0 B, free 1918.8 MiB)
19:48:41.443 INFO MemoryStore - Block broadcast_285_piece0 stored as bytes in memory (estimated size 231.0 B, free 1918.8 MiB)
19:48:41.443 INFO BlockManagerInfo - Added broadcast_285_piece0 in memory on localhost:36125 (size: 231.0 B, free: 1919.7 MiB)
19:48:41.443 INFO SparkContext - Created broadcast 285 from broadcast at BamSource.java:104
19:48:41.444 INFO MemoryStore - Block broadcast_286 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
19:48:41.450 INFO MemoryStore - Block broadcast_286_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.5 MiB)
19:48:41.450 INFO BlockManagerInfo - Added broadcast_286_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:41.451 INFO SparkContext - Created broadcast 286 from newAPIHadoopFile at PathSplitSource.java:96
19:48:41.459 INFO FileInputFormat - Total input files to process : 1
19:48:41.473 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:41.473 INFO DAGScheduler - Got job 108 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:41.473 INFO DAGScheduler - Final stage: ResultStage 151 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:41.473 INFO DAGScheduler - Parents of final stage: List()
19:48:41.473 INFO DAGScheduler - Missing parents: List()
19:48:41.473 INFO DAGScheduler - Submitting ResultStage 151 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.479 INFO MemoryStore - Block broadcast_287 stored as values in memory (estimated size 148.2 KiB, free 1918.3 MiB)
19:48:41.480 INFO MemoryStore - Block broadcast_287_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.3 MiB)
19:48:41.480 INFO BlockManagerInfo - Added broadcast_287_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:41.480 INFO SparkContext - Created broadcast 287 from broadcast at DAGScheduler.scala:1580
19:48:41.480 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 151 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.480 INFO TaskSchedulerImpl - Adding task set 151.0 with 1 tasks resource profile 0
19:48:41.481 INFO TaskSetManager - Starting task 0.0 in stage 151.0 (TID 207) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:41.481 INFO Executor - Running task 0.0 in stage 151.0 (TID 207)
19:48:41.494 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest37590291106431008143.bam:0+236517
19:48:41.498 INFO Executor - Finished task 0.0 in stage 151.0 (TID 207). 749470 bytes result sent to driver
19:48:41.500 INFO TaskSetManager - Finished task 0.0 in stage 151.0 (TID 207) in 19 ms on localhost (executor driver) (1/1)
19:48:41.500 INFO TaskSchedulerImpl - Removed TaskSet 151.0, whose tasks have all completed, from pool
19:48:41.500 INFO DAGScheduler - ResultStage 151 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.027 s
19:48:41.500 INFO DAGScheduler - Job 108 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.500 INFO TaskSchedulerImpl - Killing all running tasks in stage 151: Stage finished
19:48:41.500 INFO DAGScheduler - Job 108 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.027376 s
19:48:41.511 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:41.511 INFO DAGScheduler - Got job 109 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:41.511 INFO DAGScheduler - Final stage: ResultStage 152 (count at ReadsSparkSinkUnitTest.java:185)
19:48:41.511 INFO DAGScheduler - Parents of final stage: List()
19:48:41.511 INFO DAGScheduler - Missing parents: List()
19:48:41.511 INFO DAGScheduler - Submitting ResultStage 152 (MapPartitionsRDD[688] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.528 INFO MemoryStore - Block broadcast_288 stored as values in memory (estimated size 426.1 KiB, free 1917.9 MiB)
19:48:41.529 INFO MemoryStore - Block broadcast_288_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.7 MiB)
19:48:41.529 INFO BlockManagerInfo - Added broadcast_288_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:41.529 INFO SparkContext - Created broadcast 288 from broadcast at DAGScheduler.scala:1580
19:48:41.529 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 152 (MapPartitionsRDD[688] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.529 INFO TaskSchedulerImpl - Adding task set 152.0 with 1 tasks resource profile 0
19:48:41.530 INFO TaskSetManager - Starting task 0.0 in stage 152.0 (TID 208) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
19:48:41.530 INFO Executor - Running task 0.0 in stage 152.0 (TID 208)
19:48:41.559 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:41.566 INFO Executor - Finished task 0.0 in stage 152.0 (TID 208). 989 bytes result sent to driver
19:48:41.566 INFO TaskSetManager - Finished task 0.0 in stage 152.0 (TID 208) in 36 ms on localhost (executor driver) (1/1)
19:48:41.567 INFO TaskSchedulerImpl - Removed TaskSet 152.0, whose tasks have all completed, from pool
19:48:41.567 INFO DAGScheduler - ResultStage 152 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.056 s
19:48:41.567 INFO DAGScheduler - Job 109 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.567 INFO TaskSchedulerImpl - Killing all running tasks in stage 152: Stage finished
19:48:41.567 INFO DAGScheduler - Job 109 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.056321 s
19:48:41.570 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:41.570 INFO DAGScheduler - Got job 110 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:41.570 INFO DAGScheduler - Final stage: ResultStage 153 (count at ReadsSparkSinkUnitTest.java:185)
19:48:41.570 INFO DAGScheduler - Parents of final stage: List()
19:48:41.570 INFO DAGScheduler - Missing parents: List()
19:48:41.570 INFO DAGScheduler - Submitting ResultStage 153 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.576 INFO MemoryStore - Block broadcast_289 stored as values in memory (estimated size 148.1 KiB, free 1917.6 MiB)
19:48:41.577 INFO MemoryStore - Block broadcast_289_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.5 MiB)
19:48:41.577 INFO BlockManagerInfo - Added broadcast_289_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.4 MiB)
19:48:41.577 INFO SparkContext - Created broadcast 289 from broadcast at DAGScheduler.scala:1580
19:48:41.577 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 153 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.578 INFO TaskSchedulerImpl - Adding task set 153.0 with 1 tasks resource profile 0
19:48:41.578 INFO TaskSetManager - Starting task 0.0 in stage 153.0 (TID 209) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:41.578 INFO Executor - Running task 0.0 in stage 153.0 (TID 209)
19:48:41.589 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest37590291106431008143.bam:0+236517
19:48:41.592 INFO Executor - Finished task 0.0 in stage 153.0 (TID 209). 989 bytes result sent to driver
19:48:41.592 INFO TaskSetManager - Finished task 0.0 in stage 153.0 (TID 209) in 14 ms on localhost (executor driver) (1/1)
19:48:41.592 INFO TaskSchedulerImpl - Removed TaskSet 153.0, whose tasks have all completed, from pool
19:48:41.592 INFO DAGScheduler - ResultStage 153 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.021 s
19:48:41.592 INFO DAGScheduler - Job 110 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.592 INFO TaskSchedulerImpl - Killing all running tasks in stage 153: Stage finished
19:48:41.593 INFO DAGScheduler - Job 110 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022380 s
19:48:41.595 INFO MemoryStore - Block broadcast_290 stored as values in memory (estimated size 576.0 B, free 1917.5 MiB)
19:48:41.595 INFO MemoryStore - Block broadcast_290_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.5 MiB)
19:48:41.595 INFO BlockManagerInfo - Added broadcast_290_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.4 MiB)
19:48:41.595 INFO SparkContext - Created broadcast 290 from broadcast at CramSource.java:114
19:48:41.596 INFO MemoryStore - Block broadcast_291 stored as values in memory (estimated size 297.9 KiB, free 1917.2 MiB)
19:48:41.602 INFO MemoryStore - Block broadcast_291_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.2 MiB)
19:48:41.602 INFO BlockManagerInfo - Added broadcast_291_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:41.602 INFO SparkContext - Created broadcast 291 from newAPIHadoopFile at PathSplitSource.java:96
19:48:41.617 INFO MemoryStore - Block broadcast_292 stored as values in memory (estimated size 576.0 B, free 1917.2 MiB)
19:48:41.617 INFO MemoryStore - Block broadcast_292_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.2 MiB)
19:48:41.617 INFO BlockManagerInfo - Added broadcast_292_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.4 MiB)
19:48:41.618 INFO SparkContext - Created broadcast 292 from broadcast at CramSource.java:114
19:48:41.618 INFO MemoryStore - Block broadcast_293 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
19:48:41.624 INFO MemoryStore - Block broadcast_293_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
19:48:41.624 INFO BlockManagerInfo - Added broadcast_293_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:41.624 INFO SparkContext - Created broadcast 293 from newAPIHadoopFile at PathSplitSource.java:96
19:48:41.638 INFO FileInputFormat - Total input files to process : 1
19:48:41.639 INFO MemoryStore - Block broadcast_294 stored as values in memory (estimated size 6.0 KiB, free 1916.8 MiB)
19:48:41.639 INFO MemoryStore - Block broadcast_294_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1916.8 MiB)
19:48:41.639 INFO BlockManagerInfo - Added broadcast_294_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.3 MiB)
19:48:41.639 INFO SparkContext - Created broadcast 294 from broadcast at ReadsSparkSink.java:133
19:48:41.640 INFO MemoryStore - Block broadcast_295 stored as values in memory (estimated size 6.2 KiB, free 1916.8 MiB)
19:48:41.640 INFO MemoryStore - Block broadcast_295_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1916.8 MiB)
19:48:41.640 INFO BlockManagerInfo - Added broadcast_295_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.3 MiB)
19:48:41.641 INFO SparkContext - Created broadcast 295 from broadcast at CramSink.java:76
19:48:41.642 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:41.642 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:41.642 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:41.658 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:41.659 INFO DAGScheduler - Registering RDD 718 (mapToPair at SparkUtils.java:161) as input to shuffle 32
19:48:41.659 INFO DAGScheduler - Got job 111 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:41.659 INFO DAGScheduler - Final stage: ResultStage 155 (runJob at SparkHadoopWriter.scala:83)
19:48:41.659 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 154)
19:48:41.659 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 154)
19:48:41.659 INFO DAGScheduler - Submitting ShuffleMapStage 154 (MapPartitionsRDD[718] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:41.671 INFO MemoryStore - Block broadcast_296 stored as values in memory (estimated size 292.8 KiB, free 1916.5 MiB)
19:48:41.676 INFO BlockManagerInfo - Removed broadcast_289_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.4 MiB)
19:48:41.677 INFO BlockManagerInfo - Removed broadcast_293_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:41.677 INFO MemoryStore - Block broadcast_296_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.0 MiB)
19:48:41.677 INFO BlockManagerInfo - Added broadcast_296_piece0 in memory on localhost:36125 (size: 107.3 KiB, free: 1919.3 MiB)
19:48:41.677 INFO SparkContext - Created broadcast 296 from broadcast at DAGScheduler.scala:1580
19:48:41.677 INFO BlockManagerInfo - Removed broadcast_279_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:41.677 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 154 (MapPartitionsRDD[718] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:41.677 INFO TaskSchedulerImpl - Adding task set 154.0 with 1 tasks resource profile 0
19:48:41.678 INFO TaskSetManager - Starting task 0.0 in stage 154.0 (TID 210) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
19:48:41.678 INFO BlockManagerInfo - Removed broadcast_286_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:41.679 INFO Executor - Running task 0.0 in stage 154.0 (TID 210)
19:48:41.679 INFO BlockManagerInfo - Removed broadcast_284_piece0 on localhost:36125 in memory (size: 58.4 KiB, free: 1919.5 MiB)
19:48:41.679 INFO BlockManagerInfo - Removed broadcast_281_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.5 MiB)
19:48:41.680 INFO BlockManagerInfo - Removed broadcast_283_piece0 on localhost:36125 in memory (size: 157.6 KiB, free: 1919.6 MiB)
19:48:41.681 INFO BlockManagerInfo - Removed broadcast_288_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.8 MiB)
19:48:41.681 INFO BlockManagerInfo - Removed broadcast_285_piece0 on localhost:36125 in memory (size: 231.0 B, free: 1919.8 MiB)
19:48:41.682 INFO BlockManagerInfo - Removed broadcast_282_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.8 MiB)
19:48:41.682 INFO BlockManagerInfo - Removed broadcast_287_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.8 MiB)
19:48:41.683 INFO BlockManagerInfo - Removed broadcast_292_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.8 MiB)
19:48:41.707 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:41.720 INFO Executor - Finished task 0.0 in stage 154.0 (TID 210). 1148 bytes result sent to driver
19:48:41.720 INFO TaskSetManager - Finished task 0.0 in stage 154.0 (TID 210) in 42 ms on localhost (executor driver) (1/1)
19:48:41.720 INFO TaskSchedulerImpl - Removed TaskSet 154.0, whose tasks have all completed, from pool
19:48:41.720 INFO DAGScheduler - ShuffleMapStage 154 (mapToPair at SparkUtils.java:161) finished in 0.061 s
19:48:41.720 INFO DAGScheduler - looking for newly runnable stages
19:48:41.720 INFO DAGScheduler - running: HashSet()
19:48:41.720 INFO DAGScheduler - waiting: HashSet(ResultStage 155)
19:48:41.720 INFO DAGScheduler - failed: HashSet()
19:48:41.721 INFO DAGScheduler - Submitting ResultStage 155 (MapPartitionsRDD[723] at mapToPair at CramSink.java:89), which has no missing parents
19:48:41.727 INFO MemoryStore - Block broadcast_297 stored as values in memory (estimated size 153.2 KiB, free 1919.1 MiB)
19:48:41.728 INFO MemoryStore - Block broadcast_297_piece0 stored as bytes in memory (estimated size 58.0 KiB, free 1919.0 MiB)
19:48:41.728 INFO BlockManagerInfo - Added broadcast_297_piece0 in memory on localhost:36125 (size: 58.0 KiB, free: 1919.8 MiB)
19:48:41.728 INFO SparkContext - Created broadcast 297 from broadcast at DAGScheduler.scala:1580
19:48:41.728 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 155 (MapPartitionsRDD[723] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
19:48:41.728 INFO TaskSchedulerImpl - Adding task set 155.0 with 1 tasks resource profile 0
19:48:41.729 INFO TaskSetManager - Starting task 0.0 in stage 155.0 (TID 211) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:41.729 INFO Executor - Running task 0.0 in stage 155.0 (TID 211)
19:48:41.733 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:41.733 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:41.739 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:41.739 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:41.739 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:41.739 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:41.739 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:41.739 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:41.795 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194841240460685982189794_0723_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest56949542370689638160.cram.parts/_temporary/0/task_20250715194841240460685982189794_0723_r_000000
19:48:41.795 INFO SparkHadoopMapRedUtil - attempt_20250715194841240460685982189794_0723_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:41.796 INFO Executor - Finished task 0.0 in stage 155.0 (TID 211). 1858 bytes result sent to driver
19:48:41.796 INFO TaskSetManager - Finished task 0.0 in stage 155.0 (TID 211) in 67 ms on localhost (executor driver) (1/1)
19:48:41.796 INFO TaskSchedulerImpl - Removed TaskSet 155.0, whose tasks have all completed, from pool
19:48:41.796 INFO DAGScheduler - ResultStage 155 (runJob at SparkHadoopWriter.scala:83) finished in 0.075 s
19:48:41.796 INFO DAGScheduler - Job 111 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.796 INFO TaskSchedulerImpl - Killing all running tasks in stage 155: Stage finished
19:48:41.796 INFO DAGScheduler - Job 111 finished: runJob at SparkHadoopWriter.scala:83, took 0.138150 s
19:48:41.797 INFO SparkHadoopWriter - Start to commit write Job job_20250715194841240460685982189794_0723.
19:48:41.802 INFO SparkHadoopWriter - Write Job job_20250715194841240460685982189794_0723 committed. Elapsed time: 5 ms.
19:48:41.814 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest56949542370689638160.cram
19:48:41.819 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest56949542370689638160.cram done
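
Here the same write-and-verify cycle has been repeated for a CRAM output (CramSink.java:76/89, read back via CramSource.java:114); note that this cycle shows only part concatenation, with no .sbi/.bai merge step logged for the CRAM file. The read-back that follows again goes through newAPIHadoopFile (the repeated "Created broadcast ... from newAPIHadoopFile at PathSplitSource.java:96" lines). A hedged sketch of that call shape is below; Disq uses its own splitting BAM/CRAM input formats, so TextInputFormat and the path are stand-ins, not what the test actually uses.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

// Hedged sketch of the generic newAPIHadoopFile read-back shape.
public final class NewApiHadoopFileSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local[1]").setAppName("read-back-sketch");
        try (JavaSparkContext ctx = new JavaSparkContext(conf)) {
            JavaPairRDD<LongWritable, Text> records = ctx.newAPIHadoopFile(
                    "/tmp/example.txt",                  // hypothetical input path
                    TextInputFormat.class, LongWritable.class, Text.class,
                    new Configuration());
            System.out.println("records: " + records.count()); // triggers a count job like jobs 112-114
        }
    }
}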
19:48:41.821 INFO MemoryStore - Block broadcast_298 stored as values in memory (estimated size 504.0 B, free 1919.0 MiB)
19:48:41.821 INFO MemoryStore - Block broadcast_298_piece0 stored as bytes in memory (estimated size 159.0 B, free 1919.0 MiB)
19:48:41.821 INFO BlockManagerInfo - Added broadcast_298_piece0 in memory on localhost:36125 (size: 159.0 B, free: 1919.8 MiB)
19:48:41.822 INFO SparkContext - Created broadcast 298 from broadcast at CramSource.java:114
19:48:41.823 INFO MemoryStore - Block broadcast_299 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
19:48:41.833 INFO MemoryStore - Block broadcast_299_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
19:48:41.834 INFO BlockManagerInfo - Added broadcast_299_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:41.834 INFO SparkContext - Created broadcast 299 from newAPIHadoopFile at PathSplitSource.java:96
19:48:41.853 INFO FileInputFormat - Total input files to process : 1
19:48:41.878 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:41.878 INFO DAGScheduler - Got job 112 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:41.878 INFO DAGScheduler - Final stage: ResultStage 156 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:41.878 INFO DAGScheduler - Parents of final stage: List()
19:48:41.878 INFO DAGScheduler - Missing parents: List()
19:48:41.878 INFO DAGScheduler - Submitting ResultStage 156 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.889 INFO MemoryStore - Block broadcast_300 stored as values in memory (estimated size 286.8 KiB, free 1918.4 MiB)
19:48:41.890 INFO MemoryStore - Block broadcast_300_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.3 MiB)
19:48:41.891 INFO BlockManagerInfo - Added broadcast_300_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.6 MiB)
19:48:41.891 INFO SparkContext - Created broadcast 300 from broadcast at DAGScheduler.scala:1580
19:48:41.891 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 156 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.891 INFO TaskSchedulerImpl - Adding task set 156.0 with 1 tasks resource profile 0
19:48:41.891 INFO TaskSetManager - Starting task 0.0 in stage 156.0 (TID 212) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:41.892 INFO Executor - Running task 0.0 in stage 156.0 (TID 212)
19:48:41.911 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest56949542370689638160.cram:0+43713
19:48:41.932 INFO Executor - Finished task 0.0 in stage 156.0 (TID 212). 154101 bytes result sent to driver
19:48:41.933 INFO TaskSetManager - Finished task 0.0 in stage 156.0 (TID 212) in 42 ms on localhost (executor driver) (1/1)
19:48:41.933 INFO TaskSchedulerImpl - Removed TaskSet 156.0, whose tasks have all completed, from pool
19:48:41.933 INFO DAGScheduler - ResultStage 156 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.054 s
19:48:41.933 INFO DAGScheduler - Job 112 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.933 INFO TaskSchedulerImpl - Killing all running tasks in stage 156: Stage finished
19:48:41.933 INFO DAGScheduler - Job 112 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.055277 s
19:48:41.938 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:41.938 INFO DAGScheduler - Got job 113 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:41.938 INFO DAGScheduler - Final stage: ResultStage 157 (count at ReadsSparkSinkUnitTest.java:185)
19:48:41.938 INFO DAGScheduler - Parents of final stage: List()
19:48:41.938 INFO DAGScheduler - Missing parents: List()
19:48:41.939 INFO DAGScheduler - Submitting ResultStage 157 (MapPartitionsRDD[712] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.950 INFO MemoryStore - Block broadcast_301 stored as values in memory (estimated size 286.8 KiB, free 1918.0 MiB)
19:48:41.951 INFO MemoryStore - Block broadcast_301_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.9 MiB)
19:48:41.951 INFO BlockManagerInfo - Added broadcast_301_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.5 MiB)
19:48:41.952 INFO SparkContext - Created broadcast 301 from broadcast at DAGScheduler.scala:1580
19:48:41.952 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 157 (MapPartitionsRDD[712] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.952 INFO TaskSchedulerImpl - Adding task set 157.0 with 1 tasks resource profile 0
19:48:41.952 INFO TaskSetManager - Starting task 0.0 in stage 157.0 (TID 213) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
19:48:41.952 INFO Executor - Running task 0.0 in stage 157.0 (TID 213)
19:48:41.973 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:41.979 INFO Executor - Finished task 0.0 in stage 157.0 (TID 213). 989 bytes result sent to driver
19:48:41.979 INFO TaskSetManager - Finished task 0.0 in stage 157.0 (TID 213) in 27 ms on localhost (executor driver) (1/1)
19:48:41.979 INFO TaskSchedulerImpl - Removed TaskSet 157.0, whose tasks have all completed, from pool
19:48:41.979 INFO DAGScheduler - ResultStage 157 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.040 s
19:48:41.980 INFO DAGScheduler - Job 113 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:41.980 INFO TaskSchedulerImpl - Killing all running tasks in stage 157: Stage finished
19:48:41.980 INFO DAGScheduler - Job 113 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.041414 s
19:48:41.983 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:41.983 INFO DAGScheduler - Got job 114 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:41.983 INFO DAGScheduler - Final stage: ResultStage 158 (count at ReadsSparkSinkUnitTest.java:185)
19:48:41.983 INFO DAGScheduler - Parents of final stage: List()
19:48:41.983 INFO DAGScheduler - Missing parents: List()
19:48:41.983 INFO DAGScheduler - Submitting ResultStage 158 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:41.995 INFO MemoryStore - Block broadcast_302 stored as values in memory (estimated size 286.8 KiB, free 1917.7 MiB)
19:48:41.996 INFO MemoryStore - Block broadcast_302_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.6 MiB)
19:48:41.996 INFO BlockManagerInfo - Added broadcast_302_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.4 MiB)
19:48:41.996 INFO SparkContext - Created broadcast 302 from broadcast at DAGScheduler.scala:1580
19:48:41.996 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 158 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:41.996 INFO TaskSchedulerImpl - Adding task set 158.0 with 1 tasks resource profile 0
19:48:41.997 INFO TaskSetManager - Starting task 0.0 in stage 158.0 (TID 214) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:41.997 INFO Executor - Running task 0.0 in stage 158.0 (TID 214)
19:48:42.017 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest56949542370689638160.cram:0+43713
19:48:42.029 INFO Executor - Finished task 0.0 in stage 158.0 (TID 214). 989 bytes result sent to driver
19:48:42.029 INFO TaskSetManager - Finished task 0.0 in stage 158.0 (TID 214) in 32 ms on localhost (executor driver) (1/1)
19:48:42.029 INFO TaskSchedulerImpl - Removed TaskSet 158.0, whose tasks have all completed, from pool
19:48:42.029 INFO DAGScheduler - ResultStage 158 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.045 s
19:48:42.029 INFO DAGScheduler - Job 114 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.030 INFO TaskSchedulerImpl - Killing all running tasks in stage 158: Stage finished
19:48:42.030 INFO DAGScheduler - Job 114 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.046584 s
19:48:42.032 INFO MemoryStore - Block broadcast_303 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
19:48:42.038 INFO MemoryStore - Block broadcast_303_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.2 MiB)
19:48:42.038 INFO BlockManagerInfo - Added broadcast_303_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:42.039 INFO SparkContext - Created broadcast 303 from newAPIHadoopFile at PathSplitSource.java:96
19:48:42.060 INFO MemoryStore - Block broadcast_304 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
19:48:42.066 INFO MemoryStore - Block broadcast_304_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
19:48:42.066 INFO BlockManagerInfo - Added broadcast_304_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:42.066 INFO SparkContext - Created broadcast 304 from newAPIHadoopFile at PathSplitSource.java:96
19:48:42.086 INFO FileInputFormat - Total input files to process : 1
19:48:42.088 INFO MemoryStore - Block broadcast_305 stored as values in memory (estimated size 160.7 KiB, free 1916.7 MiB)
19:48:42.088 INFO MemoryStore - Block broadcast_305_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.7 MiB)
19:48:42.089 INFO BlockManagerInfo - Added broadcast_305_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:42.089 INFO SparkContext - Created broadcast 305 from broadcast at ReadsSparkSink.java:133
19:48:42.094 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19:48:42.094 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:42.094 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:42.115 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:42.115 INFO DAGScheduler - Registering RDD 743 (mapToPair at SparkUtils.java:161) as input to shuffle 33
19:48:42.115 INFO DAGScheduler - Got job 115 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:42.115 INFO DAGScheduler - Final stage: ResultStage 160 (runJob at SparkHadoopWriter.scala:83)
19:48:42.115 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 159)
19:48:42.115 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 159)
19:48:42.115 INFO DAGScheduler - Submitting ShuffleMapStage 159 (MapPartitionsRDD[743] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:42.132 INFO MemoryStore - Block broadcast_306 stored as values in memory (estimated size 520.4 KiB, free 1916.2 MiB)
19:48:42.134 INFO MemoryStore - Block broadcast_306_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.0 MiB)
19:48:42.134 INFO BlockManagerInfo - Added broadcast_306_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:48:42.134 INFO SparkContext - Created broadcast 306 from broadcast at DAGScheduler.scala:1580
19:48:42.134 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 159 (MapPartitionsRDD[743] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:42.134 INFO TaskSchedulerImpl - Adding task set 159.0 with 1 tasks resource profile 0
19:48:42.135 INFO TaskSetManager - Starting task 0.0 in stage 159.0 (TID 215) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:42.135 INFO Executor - Running task 0.0 in stage 159.0 (TID 215)
19:48:42.164 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:42.179 INFO Executor - Finished task 0.0 in stage 159.0 (TID 215). 1148 bytes result sent to driver
19:48:42.180 INFO TaskSetManager - Finished task 0.0 in stage 159.0 (TID 215) in 44 ms on localhost (executor driver) (1/1)
19:48:42.180 INFO TaskSchedulerImpl - Removed TaskSet 159.0, whose tasks have all completed, from pool
19:48:42.180 INFO DAGScheduler - ShuffleMapStage 159 (mapToPair at SparkUtils.java:161) finished in 0.064 s
19:48:42.180 INFO DAGScheduler - looking for newly runnable stages
19:48:42.180 INFO DAGScheduler - running: HashSet()
19:48:42.180 INFO DAGScheduler - waiting: HashSet(ResultStage 160)
19:48:42.180 INFO DAGScheduler - failed: HashSet()
19:48:42.180 INFO DAGScheduler - Submitting ResultStage 160 (MapPartitionsRDD[749] at saveAsTextFile at SamSink.java:65), which has no missing parents
19:48:42.187 INFO MemoryStore - Block broadcast_307 stored as values in memory (estimated size 241.1 KiB, free 1915.8 MiB)
19:48:42.192 INFO BlockManagerInfo - Removed broadcast_304_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:42.192 INFO MemoryStore - Block broadcast_307_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1916.1 MiB)
19:48:42.192 INFO BlockManagerInfo - Added broadcast_307_piece0 in memory on localhost:36125 (size: 66.9 KiB, free: 1919.1 MiB)
19:48:42.192 INFO BlockManagerInfo - Removed broadcast_302_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.2 MiB)
19:48:42.192 INFO SparkContext - Created broadcast 307 from broadcast at DAGScheduler.scala:1580
19:48:42.192 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 160 (MapPartitionsRDD[749] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
19:48:42.192 INFO TaskSchedulerImpl - Adding task set 160.0 with 1 tasks resource profile 0
19:48:42.192 INFO BlockManagerInfo - Removed broadcast_296_piece0 on localhost:36125 in memory (size: 107.3 KiB, free: 1919.4 MiB)
19:48:42.193 INFO BlockManagerInfo - Removed broadcast_291_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:42.193 INFO TaskSetManager - Starting task 0.0 in stage 160.0 (TID 216) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:42.193 INFO BlockManagerInfo - Removed broadcast_298_piece0 on localhost:36125 in memory (size: 159.0 B, free: 1919.4 MiB)
19:48:42.193 INFO Executor - Running task 0.0 in stage 160.0 (TID 216)
19:48:42.194 INFO BlockManagerInfo - Removed broadcast_299_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:42.195 INFO BlockManagerInfo - Removed broadcast_295_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.5 MiB)
19:48:42.195 INFO BlockManagerInfo - Removed broadcast_297_piece0 on localhost:36125 in memory (size: 58.0 KiB, free: 1919.5 MiB)
19:48:42.195 INFO BlockManagerInfo - Removed broadcast_290_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.5 MiB)
19:48:42.196 INFO BlockManagerInfo - Removed broadcast_294_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.5 MiB)
19:48:42.196 INFO BlockManagerInfo - Removed broadcast_301_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.6 MiB)
19:48:42.197 INFO BlockManagerInfo - Removed broadcast_300_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.7 MiB)
19:48:42.199 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:42.199 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:42.211 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19:48:42.211 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:42.211 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:42.229 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948425569260445789447074_0749_m_000000_0' to file:/tmp/ReadsSparkSinkUnitTest6496458089132936373.sam.parts/_temporary/0/task_202507151948425569260445789447074_0749_m_000000
19:48:42.229 INFO SparkHadoopMapRedUtil - attempt_202507151948425569260445789447074_0749_m_000000_0: Committed. Elapsed time: 0 ms.
19:48:42.230 INFO Executor - Finished task 0.0 in stage 160.0 (TID 216). 1858 bytes result sent to driver
19:48:42.230 INFO TaskSetManager - Finished task 0.0 in stage 160.0 (TID 216) in 37 ms on localhost (executor driver) (1/1)
19:48:42.230 INFO TaskSchedulerImpl - Removed TaskSet 160.0, whose tasks have all completed, from pool
19:48:42.230 INFO DAGScheduler - ResultStage 160 (runJob at SparkHadoopWriter.scala:83) finished in 0.050 s
19:48:42.231 INFO DAGScheduler - Job 115 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.231 INFO TaskSchedulerImpl - Killing all running tasks in stage 160: Stage finished
19:48:42.231 INFO DAGScheduler - Job 115 finished: runJob at SparkHadoopWriter.scala:83, took 0.116261 s
19:48:42.231 INFO SparkHadoopWriter - Start to commit write Job job_202507151948425569260445789447074_0749.
19:48:42.235 INFO SparkHadoopWriter - Write Job job_202507151948425569260445789447074_0749 committed. Elapsed time: 4 ms.
19:48:42.243 INFO HadoopFileSystemWrapper - Concatenating 2 parts to /tmp/ReadsSparkSinkUnitTest6496458089132936373.sam
19:48:42.248 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest6496458089132936373.sam done
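[editor's note] Job 115 above is the SAM write path of this test: the header is broadcast (ReadsSparkSink.java:133), the reads are shuffled into order via mapToPair at SparkUtils.java:161, each partition is committed as a part file under the temporary *.sam.parts directory by FileOutputCommitter, and HadoopFileSystemWrapper then concatenates the parts into the final .sam file. The sketch below illustrates only the general shard-then-concatenate idea using plain Spark and java.nio; it is not the GATK/Disq implementation, and the class, method, and path names are invented for illustration.

    import org.apache.spark.api.java.JavaRDD;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.Comparator;
    import java.util.stream.Stream;

    public final class ShardThenConcat {
        // Illustrative only: save an RDD as text shards, then stitch the shards into one file.
        // This mimics the saveAsTextFile + "Concatenating N parts" steps seen in the log, without
        // the header handling and format details that SamSink/HadoopFileSystemWrapper provide.
        static void writeAndMerge(JavaRDD<String> records, String partsDir, String finalFile) throws IOException {
            records.saveAsTextFile(partsDir); // writes part-00000, part-00001, ... plus _SUCCESS
            try (OutputStream out = Files.newOutputStream(Paths.get(finalFile));
                 Stream<Path> parts = Files.list(Paths.get(partsDir))) {
                parts.filter(p -> p.getFileName().toString().startsWith("part-"))
                     .sorted(Comparator.comparing((Path p) -> p.getFileName().toString()))
                     .forEach(p -> {
                         try {
                             Files.copy(p, out); // append this shard to the merged output
                         } catch (IOException e) {
                             throw new UncheckedIOException(e);
                         }
                     });
            }
        }
    }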
WARNING 2025-07-15 19:48:42 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-07-15 19:48:42 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
19:48:42.250 INFO MemoryStore - Block broadcast_308 stored as values in memory (estimated size 160.7 KiB, free 1918.4 MiB)
19:48:42.251 INFO MemoryStore - Block broadcast_308_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.4 MiB)
19:48:42.251 INFO BlockManagerInfo - Added broadcast_308_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.7 MiB)
19:48:42.252 INFO SparkContext - Created broadcast 308 from broadcast at SamSource.java:78
19:48:42.252 INFO MemoryStore - Block broadcast_309 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:42.259 INFO MemoryStore - Block broadcast_309_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:42.259 INFO BlockManagerInfo - Added broadcast_309_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:42.259 INFO SparkContext - Created broadcast 309 from newAPIHadoopFile at SamSource.java:108
19:48:42.261 INFO FileInputFormat - Total input files to process : 1
19:48:42.265 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:42.265 INFO DAGScheduler - Got job 116 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:42.265 INFO DAGScheduler - Final stage: ResultStage 161 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:42.265 INFO DAGScheduler - Parents of final stage: List()
19:48:42.265 INFO DAGScheduler - Missing parents: List()
19:48:42.265 INFO DAGScheduler - Submitting ResultStage 161 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:42.266 INFO MemoryStore - Block broadcast_310 stored as values in memory (estimated size 7.5 KiB, free 1918.0 MiB)
19:48:42.266 INFO MemoryStore - Block broadcast_310_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.0 MiB)
19:48:42.266 INFO BlockManagerInfo - Added broadcast_310_piece0 in memory on localhost:36125 (size: 3.8 KiB, free: 1919.7 MiB)
19:48:42.266 INFO SparkContext - Created broadcast 310 from broadcast at DAGScheduler.scala:1580
19:48:42.266 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 161 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:42.266 INFO TaskSchedulerImpl - Adding task set 161.0 with 1 tasks resource profile 0
19:48:42.267 INFO TaskSetManager - Starting task 0.0 in stage 161.0 (TID 217) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7808 bytes)
19:48:42.267 INFO Executor - Running task 0.0 in stage 161.0 (TID 217)
19:48:42.268 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest6496458089132936373.sam:0+847558
19:48:42.283 INFO Executor - Finished task 0.0 in stage 161.0 (TID 217). 651483 bytes result sent to driver
19:48:42.284 INFO TaskSetManager - Finished task 0.0 in stage 161.0 (TID 217) in 17 ms on localhost (executor driver) (1/1)
19:48:42.284 INFO TaskSchedulerImpl - Removed TaskSet 161.0, whose tasks have all completed, from pool
19:48:42.284 INFO DAGScheduler - ResultStage 161 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.019 s
19:48:42.284 INFO DAGScheduler - Job 116 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.285 INFO TaskSchedulerImpl - Killing all running tasks in stage 161: Stage finished
19:48:42.285 INFO DAGScheduler - Job 116 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.020027 s
19:48:42.300 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:42.300 INFO DAGScheduler - Got job 117 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:42.300 INFO DAGScheduler - Final stage: ResultStage 162 (count at ReadsSparkSinkUnitTest.java:185)
19:48:42.300 INFO DAGScheduler - Parents of final stage: List()
19:48:42.300 INFO DAGScheduler - Missing parents: List()
19:48:42.300 INFO DAGScheduler - Submitting ResultStage 162 (MapPartitionsRDD[736] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:42.317 INFO MemoryStore - Block broadcast_311 stored as values in memory (estimated size 426.1 KiB, free 1917.6 MiB)
19:48:42.318 INFO MemoryStore - Block broadcast_311_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
19:48:42.319 INFO BlockManagerInfo - Added broadcast_311_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:42.319 INFO SparkContext - Created broadcast 311 from broadcast at DAGScheduler.scala:1580
19:48:42.319 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 162 (MapPartitionsRDD[736] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:42.319 INFO TaskSchedulerImpl - Adding task set 162.0 with 1 tasks resource profile 0
19:48:42.320 INFO TaskSetManager - Starting task 0.0 in stage 162.0 (TID 218) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:42.320 INFO Executor - Running task 0.0 in stage 162.0 (TID 218)
19:48:42.350 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:42.360 INFO Executor - Finished task 0.0 in stage 162.0 (TID 218). 989 bytes result sent to driver
19:48:42.360 INFO TaskSetManager - Finished task 0.0 in stage 162.0 (TID 218) in 41 ms on localhost (executor driver) (1/1)
19:48:42.360 INFO TaskSchedulerImpl - Removed TaskSet 162.0, whose tasks have all completed, from pool
19:48:42.360 INFO DAGScheduler - ResultStage 162 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.059 s
19:48:42.360 INFO DAGScheduler - Job 117 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.361 INFO TaskSchedulerImpl - Killing all running tasks in stage 162: Stage finished
19:48:42.361 INFO DAGScheduler - Job 117 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060578 s
19:48:42.365 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:42.365 INFO DAGScheduler - Got job 118 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:42.365 INFO DAGScheduler - Final stage: ResultStage 163 (count at ReadsSparkSinkUnitTest.java:185)
19:48:42.365 INFO DAGScheduler - Parents of final stage: List()
19:48:42.365 INFO DAGScheduler - Missing parents: List()
19:48:42.365 INFO DAGScheduler - Submitting ResultStage 163 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:42.365 INFO MemoryStore - Block broadcast_312 stored as values in memory (estimated size 7.4 KiB, free 1917.4 MiB)
19:48:42.366 INFO MemoryStore - Block broadcast_312_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1917.4 MiB)
19:48:42.366 INFO BlockManagerInfo - Added broadcast_312_piece0 in memory on localhost:36125 (size: 3.8 KiB, free: 1919.5 MiB)
19:48:42.366 INFO SparkContext - Created broadcast 312 from broadcast at DAGScheduler.scala:1580
19:48:42.366 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 163 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:42.366 INFO TaskSchedulerImpl - Adding task set 163.0 with 1 tasks resource profile 0
19:48:42.367 INFO TaskSetManager - Starting task 0.0 in stage 163.0 (TID 219) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7808 bytes)
19:48:42.367 INFO Executor - Running task 0.0 in stage 163.0 (TID 219)
19:48:42.368 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest6496458089132936373.sam:0+847558
19:48:42.375 INFO Executor - Finished task 0.0 in stage 163.0 (TID 219). 946 bytes result sent to driver
19:48:42.375 INFO TaskSetManager - Finished task 0.0 in stage 163.0 (TID 219) in 8 ms on localhost (executor driver) (1/1)
19:48:42.375 INFO TaskSchedulerImpl - Removed TaskSet 163.0, whose tasks have all completed, from pool
19:48:42.375 INFO DAGScheduler - ResultStage 163 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.010 s
19:48:42.375 INFO DAGScheduler - Job 118 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.375 INFO TaskSchedulerImpl - Killing all running tasks in stage 163: Stage finished
19:48:42.375 INFO DAGScheduler - Job 118 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.010684 s
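[editor's note] Jobs 116-118 are the round-trip validation: the test re-reads the SAM it just wrote (collect at ReadsSparkSinkUnitTest.java:182) and compares record counts of the original and re-read RDDs (count at ReadsSparkSinkUnitTest.java:185). A minimal sketch of that check, assuming placeholder RDDs of plain strings rather than the actual GATKRead objects, looks roughly like this:

    import org.apache.spark.api.java.JavaRDD;
    import java.util.List;

    public final class RoundTripCheck {
        // Hypothetical round-trip assertion mirroring the collect/count jobs in the log.
        // The real test operates on GATKRead objects; plain strings stand in for them here.
        static void assertRoundTrip(JavaRDD<String> originalReads, JavaRDD<String> rereadReads) {
            List<String> reread = rereadReads.collect(); // like collect at ReadsSparkSinkUnitTest.java:182
            long expected = originalReads.count();       // like count at ReadsSparkSinkUnitTest.java:185
            long actual = rereadReads.count();
            if (expected != actual || reread.size() != actual) {
                throw new AssertionError("round trip changed record count: " + expected + " vs " + actual);
            }
        }
    }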
WARNING 2025-07-15 19:48:42 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-07-15 19:48:42 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
19:48:42.378 INFO MemoryStore - Block broadcast_313 stored as values in memory (estimated size 21.0 KiB, free 1917.4 MiB)
19:48:42.379 INFO MemoryStore - Block broadcast_313_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1917.4 MiB)
19:48:42.379 INFO BlockManagerInfo - Added broadcast_313_piece0 in memory on localhost:36125 (size: 2.4 KiB, free: 1919.5 MiB)
19:48:42.379 INFO SparkContext - Created broadcast 313 from broadcast at SamSource.java:78
19:48:42.380 INFO MemoryStore - Block broadcast_314 stored as values in memory (estimated size 298.0 KiB, free 1917.1 MiB)
19:48:42.391 INFO MemoryStore - Block broadcast_314_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.1 MiB)
19:48:42.391 INFO BlockManagerInfo - Added broadcast_314_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.4 MiB)
19:48:42.391 INFO SparkContext - Created broadcast 314 from newAPIHadoopFile at SamSource.java:108
19:48:42.396 INFO FileInputFormat - Total input files to process : 1
19:48:42.401 INFO SparkContext - Starting job: collect at SparkUtils.java:205
19:48:42.401 INFO DAGScheduler - Got job 119 (collect at SparkUtils.java:205) with 1 output partitions
19:48:42.401 INFO DAGScheduler - Final stage: ResultStage 164 (collect at SparkUtils.java:205)
19:48:42.401 INFO DAGScheduler - Parents of final stage: List()
19:48:42.401 INFO DAGScheduler - Missing parents: List()
19:48:42.401 INFO DAGScheduler - Submitting ResultStage 164 (MapPartitionsRDD[760] at mapPartitions at SparkUtils.java:188), which has no missing parents
19:48:42.402 INFO MemoryStore - Block broadcast_315 stored as values in memory (estimated size 7.9 KiB, free 1917.1 MiB)
19:48:42.402 INFO MemoryStore - Block broadcast_315_piece0 stored as bytes in memory (estimated size 3.9 KiB, free 1917.1 MiB)
19:48:42.402 INFO BlockManagerInfo - Added broadcast_315_piece0 in memory on localhost:36125 (size: 3.9 KiB, free: 1919.4 MiB)
19:48:42.402 INFO SparkContext - Created broadcast 315 from broadcast at DAGScheduler.scala:1580
19:48:42.402 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 164 (MapPartitionsRDD[760] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
19:48:42.402 INFO TaskSchedulerImpl - Adding task set 164.0 with 1 tasks resource profile 0
19:48:42.403 INFO TaskSetManager - Starting task 0.0 in stage 164.0 (TID 220) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7936 bytes)
19:48:42.403 INFO Executor - Running task 0.0 in stage 164.0 (TID 220)
19:48:42.404 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
19:48:42.407 INFO Executor - Finished task 0.0 in stage 164.0 (TID 220). 1657 bytes result sent to driver
19:48:42.408 INFO TaskSetManager - Finished task 0.0 in stage 164.0 (TID 220) in 5 ms on localhost (executor driver) (1/1)
19:48:42.408 INFO TaskSchedulerImpl - Removed TaskSet 164.0, whose tasks have all completed, from pool
19:48:42.408 INFO DAGScheduler - ResultStage 164 (collect at SparkUtils.java:205) finished in 0.007 s
19:48:42.408 INFO DAGScheduler - Job 119 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.408 INFO TaskSchedulerImpl - Killing all running tasks in stage 164: Stage finished
19:48:42.408 INFO DAGScheduler - Job 119 finished: collect at SparkUtils.java:205, took 0.007412 s
WARNING 2025-07-15 19:48:42 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-07-15 19:48:42 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
19:48:42.413 INFO MemoryStore - Block broadcast_316 stored as values in memory (estimated size 21.0 KiB, free 1917.0 MiB)
19:48:42.414 INFO MemoryStore - Block broadcast_316_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1917.0 MiB)
19:48:42.414 INFO BlockManagerInfo - Added broadcast_316_piece0 in memory on localhost:36125 (size: 2.4 KiB, free: 1919.4 MiB)
19:48:42.414 INFO SparkContext - Created broadcast 316 from broadcast at SamSource.java:78
19:48:42.415 INFO MemoryStore - Block broadcast_317 stored as values in memory (estimated size 298.0 KiB, free 1916.7 MiB)
19:48:42.425 INFO MemoryStore - Block broadcast_317_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.7 MiB)
19:48:42.426 INFO BlockManagerInfo - Added broadcast_317_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.4 MiB)
19:48:42.426 INFO SparkContext - Created broadcast 317 from newAPIHadoopFile at SamSource.java:108
19:48:42.428 INFO MemoryStore - Block broadcast_318 stored as values in memory (estimated size 21.0 KiB, free 1916.7 MiB)
19:48:42.428 INFO MemoryStore - Block broadcast_318_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1916.7 MiB)
19:48:42.428 INFO BlockManagerInfo - Added broadcast_318_piece0 in memory on localhost:36125 (size: 2.4 KiB, free: 1919.4 MiB)
19:48:42.429 INFO SparkContext - Created broadcast 318 from broadcast at ReadsSparkSink.java:133
19:48:42.429 INFO MemoryStore - Block broadcast_319 stored as values in memory (estimated size 21.5 KiB, free 1916.6 MiB)
19:48:42.434 INFO MemoryStore - Block broadcast_319_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1916.6 MiB)
19:48:42.434 INFO BlockManagerInfo - Added broadcast_319_piece0 in memory on localhost:36125 (size: 2.4 KiB, free: 1919.4 MiB)
19:48:42.434 INFO BlockManagerInfo - Removed broadcast_312_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1919.4 MiB)
19:48:42.434 INFO SparkContext - Created broadcast 319 from broadcast at BamSink.java:76
19:48:42.435 INFO BlockManagerInfo - Removed broadcast_308_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:42.435 INFO BlockManagerInfo - Removed broadcast_317_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.4 MiB)
19:48:42.436 INFO BlockManagerInfo - Removed broadcast_315_piece0 on localhost:36125 in memory (size: 3.9 KiB, free: 1919.5 MiB)
19:48:42.437 INFO BlockManagerInfo - Removed broadcast_307_piece0 on localhost:36125 in memory (size: 66.9 KiB, free: 1919.5 MiB)
19:48:42.437 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:42.437 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:42.437 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:42.437 INFO BlockManagerInfo - Removed broadcast_305_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.5 MiB)
19:48:42.440 INFO BlockManagerInfo - Removed broadcast_306_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.7 MiB)
19:48:42.441 INFO BlockManagerInfo - Removed broadcast_316_piece0 on localhost:36125 in memory (size: 2.4 KiB, free: 1919.7 MiB)
19:48:42.442 INFO BlockManagerInfo - Removed broadcast_311_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.8 MiB)
19:48:42.442 INFO BlockManagerInfo - Removed broadcast_303_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:42.443 INFO BlockManagerInfo - Removed broadcast_309_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:42.443 INFO BlockManagerInfo - Removed broadcast_310_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1919.9 MiB)
19:48:42.460 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:42.460 INFO DAGScheduler - Got job 120 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:42.460 INFO DAGScheduler - Final stage: ResultStage 165 (runJob at SparkHadoopWriter.scala:83)
19:48:42.460 INFO DAGScheduler - Parents of final stage: List()
19:48:42.461 INFO DAGScheduler - Missing parents: List()
19:48:42.461 INFO DAGScheduler - Submitting ResultStage 165 (MapPartitionsRDD[770] at mapToPair at BamSink.java:91), which has no missing parents
19:48:42.467 INFO MemoryStore - Block broadcast_320 stored as values in memory (estimated size 152.3 KiB, free 1919.4 MiB)
19:48:42.468 INFO MemoryStore - Block broadcast_320_piece0 stored as bytes in memory (estimated size 56.4 KiB, free 1919.4 MiB)
19:48:42.468 INFO BlockManagerInfo - Added broadcast_320_piece0 in memory on localhost:36125 (size: 56.4 KiB, free: 1919.9 MiB)
19:48:42.468 INFO SparkContext - Created broadcast 320 from broadcast at DAGScheduler.scala:1580
19:48:42.468 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 165 (MapPartitionsRDD[770] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:42.468 INFO TaskSchedulerImpl - Adding task set 165.0 with 1 tasks resource profile 0
19:48:42.469 INFO TaskSetManager - Starting task 0.0 in stage 165.0 (TID 221) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8561 bytes)
19:48:42.469 INFO Executor - Running task 0.0 in stage 165.0 (TID 221)
19:48:42.475 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
19:48:42.479 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:42.479 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:42.479 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:42.479 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:42.479 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:42.479 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:42.506 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948424597871930292002754_0770_r_000000_0' to file:/tmp/ReadsSparkSinkNotSorting14609238682871361264.bam.parts/_temporary/0/task_202507151948424597871930292002754_0770_r_000000
19:48:42.506 INFO SparkHadoopMapRedUtil - attempt_202507151948424597871930292002754_0770_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:42.506 INFO Executor - Finished task 0.0 in stage 165.0 (TID 221). 1084 bytes result sent to driver
19:48:42.507 INFO TaskSetManager - Finished task 0.0 in stage 165.0 (TID 221) in 38 ms on localhost (executor driver) (1/1)
19:48:42.507 INFO TaskSchedulerImpl - Removed TaskSet 165.0, whose tasks have all completed, from pool
19:48:42.507 INFO DAGScheduler - ResultStage 165 (runJob at SparkHadoopWriter.scala:83) finished in 0.046 s
19:48:42.507 INFO DAGScheduler - Job 120 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.507 INFO TaskSchedulerImpl - Killing all running tasks in stage 165: Stage finished
19:48:42.507 INFO DAGScheduler - Job 120 finished: runJob at SparkHadoopWriter.scala:83, took 0.047098 s
19:48:42.507 INFO SparkHadoopWriter - Start to commit write Job job_202507151948424597871930292002754_0770.
19:48:42.512 INFO SparkHadoopWriter - Write Job job_202507151948424597871930292002754_0770 committed. Elapsed time: 4 ms.
19:48:42.524 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkNotSorting14609238682871361264.bam
19:48:42.528 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkNotSorting14609238682871361264.bam done
19:48:42.528 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkNotSorting14609238682871361264.bam.parts/ to /tmp/ReadsSparkSinkNotSorting14609238682871361264.bam.sbi
19:48:42.533 INFO IndexFileMerger - Done merging .sbi files
19:48:42.534 INFO MemoryStore - Block broadcast_321 stored as values in memory (estimated size 192.0 B, free 1919.4 MiB)
19:48:42.534 INFO MemoryStore - Block broadcast_321_piece0 stored as bytes in memory (estimated size 127.0 B, free 1919.4 MiB)
19:48:42.534 INFO BlockManagerInfo - Added broadcast_321_piece0 in memory on localhost:36125 (size: 127.0 B, free: 1919.9 MiB)
19:48:42.534 INFO SparkContext - Created broadcast 321 from broadcast at BamSource.java:104
19:48:42.535 INFO MemoryStore - Block broadcast_322 stored as values in memory (estimated size 297.9 KiB, free 1919.1 MiB)
19:48:42.541 INFO MemoryStore - Block broadcast_322_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.0 MiB)
19:48:42.541 INFO BlockManagerInfo - Added broadcast_322_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.8 MiB)
19:48:42.542 INFO SparkContext - Created broadcast 322 from newAPIHadoopFile at PathSplitSource.java:96
19:48:42.551 INFO FileInputFormat - Total input files to process : 1
19:48:42.565 INFO SparkContext - Starting job: collect at SparkUtils.java:205
19:48:42.565 INFO DAGScheduler - Got job 121 (collect at SparkUtils.java:205) with 1 output partitions
19:48:42.565 INFO DAGScheduler - Final stage: ResultStage 166 (collect at SparkUtils.java:205)
19:48:42.565 INFO DAGScheduler - Parents of final stage: List()
19:48:42.565 INFO DAGScheduler - Missing parents: List()
19:48:42.565 INFO DAGScheduler - Submitting ResultStage 166 (MapPartitionsRDD[777] at mapPartitions at SparkUtils.java:188), which has no missing parents
19:48:42.571 INFO MemoryStore - Block broadcast_323 stored as values in memory (estimated size 148.6 KiB, free 1918.9 MiB)
19:48:42.572 INFO MemoryStore - Block broadcast_323_piece0 stored as bytes in memory (estimated size 54.7 KiB, free 1918.8 MiB)
19:48:42.572 INFO BlockManagerInfo - Added broadcast_323_piece0 in memory on localhost:36125 (size: 54.7 KiB, free: 1919.8 MiB)
19:48:42.572 INFO SparkContext - Created broadcast 323 from broadcast at DAGScheduler.scala:1580
19:48:42.573 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 166 (MapPartitionsRDD[777] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
19:48:42.573 INFO TaskSchedulerImpl - Adding task set 166.0 with 1 tasks resource profile 0
19:48:42.573 INFO TaskSetManager - Starting task 0.0 in stage 166.0 (TID 222) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7811 bytes)
19:48:42.573 INFO Executor - Running task 0.0 in stage 166.0 (TID 222)
19:48:42.590 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkNotSorting14609238682871361264.bam:0+59395
19:48:42.591 INFO Executor - Finished task 0.0 in stage 166.0 (TID 222). 1700 bytes result sent to driver
19:48:42.592 INFO TaskSetManager - Finished task 0.0 in stage 166.0 (TID 222) in 19 ms on localhost (executor driver) (1/1)
19:48:42.592 INFO TaskSchedulerImpl - Removed TaskSet 166.0, whose tasks have all completed, from pool
19:48:42.592 INFO DAGScheduler - ResultStage 166 (collect at SparkUtils.java:205) finished in 0.026 s
19:48:42.592 INFO DAGScheduler - Job 121 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.592 INFO TaskSchedulerImpl - Killing all running tasks in stage 166: Stage finished
19:48:42.592 INFO DAGScheduler - Job 121 finished: collect at SparkUtils.java:205, took 0.026985 s
19:48:42.608 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:91
19:48:42.608 INFO DAGScheduler - Got job 122 (collect at ReadsSparkSinkUnitTest.java:91) with 1 output partitions
19:48:42.608 INFO DAGScheduler - Final stage: ResultStage 167 (collect at ReadsSparkSinkUnitTest.java:91)
19:48:42.608 INFO DAGScheduler - Parents of final stage: List()
19:48:42.608 INFO DAGScheduler - Missing parents: List()
19:48:42.608 INFO DAGScheduler - Submitting ResultStage 167 (ZippedPartitionsRDD2[780] at zipPartitions at SparkUtils.java:244), which has no missing parents
19:48:42.619 INFO MemoryStore - Block broadcast_324 stored as values in memory (estimated size 149.8 KiB, free 1918.7 MiB)
19:48:42.619 INFO MemoryStore - Block broadcast_324_piece0 stored as bytes in memory (estimated size 55.2 KiB, free 1918.6 MiB)
19:48:42.620 INFO BlockManagerInfo - Added broadcast_324_piece0 in memory on localhost:36125 (size: 55.2 KiB, free: 1919.7 MiB)
19:48:42.620 INFO SparkContext - Created broadcast 324 from broadcast at DAGScheduler.scala:1580
19:48:42.620 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 167 (ZippedPartitionsRDD2[780] at zipPartitions at SparkUtils.java:244) (first 15 tasks are for partitions Vector(0))
19:48:42.620 INFO TaskSchedulerImpl - Adding task set 167.0 with 1 tasks resource profile 0
19:48:42.620 INFO TaskSetManager - Starting task 0.0 in stage 167.0 (TID 223) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8436 bytes)
19:48:42.621 INFO Executor - Running task 0.0 in stage 167.0 (TID 223)
19:48:42.632 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkNotSorting14609238682871361264.bam:0+59395
19:48:42.634 INFO Executor - Finished task 0.0 in stage 167.0 (TID 223). 192451 bytes result sent to driver
19:48:42.634 INFO TaskSetManager - Finished task 0.0 in stage 167.0 (TID 223) in 14 ms on localhost (executor driver) (1/1)
19:48:42.634 INFO TaskSchedulerImpl - Removed TaskSet 167.0, whose tasks have all completed, from pool
19:48:42.635 INFO DAGScheduler - ResultStage 167 (collect at ReadsSparkSinkUnitTest.java:91) finished in 0.026 s
19:48:42.635 INFO DAGScheduler - Job 122 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.635 INFO TaskSchedulerImpl - Killing all running tasks in stage 167: Stage finished
19:48:42.635 INFO DAGScheduler - Job 122 finished: collect at ReadsSparkSinkUnitTest.java:91, took 0.026694 s
WARNING 2025-07-15 19:48:42 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-07-15 19:48:42 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
19:48:42.636 INFO MemoryStore - Block broadcast_325 stored as values in memory (estimated size 21.0 KiB, free 1918.6 MiB)
19:48:42.637 INFO MemoryStore - Block broadcast_325_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1918.6 MiB)
19:48:42.637 INFO BlockManagerInfo - Added broadcast_325_piece0 in memory on localhost:36125 (size: 2.4 KiB, free: 1919.7 MiB)
19:48:42.637 INFO SparkContext - Created broadcast 325 from broadcast at SamSource.java:78
19:48:42.638 INFO MemoryStore - Block broadcast_326 stored as values in memory (estimated size 298.0 KiB, free 1918.3 MiB)
19:48:42.644 INFO MemoryStore - Block broadcast_326_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.3 MiB)
19:48:42.644 INFO BlockManagerInfo - Added broadcast_326_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.7 MiB)
19:48:42.644 INFO SparkContext - Created broadcast 326 from newAPIHadoopFile at SamSource.java:108
19:48:42.647 INFO FileInputFormat - Total input files to process : 1
19:48:42.651 INFO SparkContext - Starting job: collect at SparkUtils.java:205
19:48:42.651 INFO DAGScheduler - Got job 123 (collect at SparkUtils.java:205) with 1 output partitions
19:48:42.651 INFO DAGScheduler - Final stage: ResultStage 168 (collect at SparkUtils.java:205)
19:48:42.651 INFO DAGScheduler - Parents of final stage: List()
19:48:42.651 INFO DAGScheduler - Missing parents: List()
19:48:42.651 INFO DAGScheduler - Submitting ResultStage 168 (MapPartitionsRDD[786] at mapPartitions at SparkUtils.java:188), which has no missing parents
19:48:42.651 INFO MemoryStore - Block broadcast_327 stored as values in memory (estimated size 7.9 KiB, free 1918.3 MiB)
19:48:42.652 INFO MemoryStore - Block broadcast_327_piece0 stored as bytes in memory (estimated size 3.9 KiB, free 1918.3 MiB)
19:48:42.652 INFO BlockManagerInfo - Added broadcast_327_piece0 in memory on localhost:36125 (size: 3.9 KiB, free: 1919.7 MiB)
19:48:42.652 INFO SparkContext - Created broadcast 327 from broadcast at DAGScheduler.scala:1580
19:48:42.652 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 168 (MapPartitionsRDD[786] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
19:48:42.652 INFO TaskSchedulerImpl - Adding task set 168.0 with 1 tasks resource profile 0
19:48:42.653 INFO TaskSetManager - Starting task 0.0 in stage 168.0 (TID 224) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7936 bytes)
19:48:42.653 INFO Executor - Running task 0.0 in stage 168.0 (TID 224)
19:48:42.654 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
19:48:42.656 INFO Executor - Finished task 0.0 in stage 168.0 (TID 224). 1657 bytes result sent to driver
19:48:42.656 INFO TaskSetManager - Finished task 0.0 in stage 168.0 (TID 224) in 3 ms on localhost (executor driver) (1/1)
19:48:42.656 INFO TaskSchedulerImpl - Removed TaskSet 168.0, whose tasks have all completed, from pool
19:48:42.656 INFO DAGScheduler - ResultStage 168 (collect at SparkUtils.java:205) finished in 0.005 s
19:48:42.657 INFO DAGScheduler - Job 123 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.657 INFO TaskSchedulerImpl - Killing all running tasks in stage 168: Stage finished
19:48:42.657 INFO DAGScheduler - Job 123 finished: collect at SparkUtils.java:205, took 0.006199 s
19:48:42.662 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:94
19:48:42.662 INFO DAGScheduler - Got job 124 (collect at ReadsSparkSinkUnitTest.java:94) with 1 output partitions
19:48:42.662 INFO DAGScheduler - Final stage: ResultStage 169 (collect at ReadsSparkSinkUnitTest.java:94)
19:48:42.662 INFO DAGScheduler - Parents of final stage: List()
19:48:42.662 INFO DAGScheduler - Missing parents: List()
19:48:42.662 INFO DAGScheduler - Submitting ResultStage 169 (ZippedPartitionsRDD2[789] at zipPartitions at SparkUtils.java:244), which has no missing parents
19:48:42.663 INFO MemoryStore - Block broadcast_328 stored as values in memory (estimated size 9.6 KiB, free 1918.3 MiB)
19:48:42.663 INFO MemoryStore - Block broadcast_328_piece0 stored as bytes in memory (estimated size 4.4 KiB, free 1918.3 MiB)
19:48:42.663 INFO BlockManagerInfo - Added broadcast_328_piece0 in memory on localhost:36125 (size: 4.4 KiB, free: 1919.7 MiB)
19:48:42.663 INFO SparkContext - Created broadcast 328 from broadcast at DAGScheduler.scala:1580
19:48:42.663 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 169 (ZippedPartitionsRDD2[789] at zipPartitions at SparkUtils.java:244) (first 15 tasks are for partitions Vector(0))
19:48:42.664 INFO TaskSchedulerImpl - Adding task set 169.0 with 1 tasks resource profile 0
19:48:42.664 INFO TaskSetManager - Starting task 0.0 in stage 169.0 (TID 225) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8561 bytes)
19:48:42.664 INFO Executor - Running task 0.0 in stage 169.0 (TID 225)
19:48:42.665 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
19:48:42.671 INFO Executor - Finished task 0.0 in stage 169.0 (TID 225). 192451 bytes result sent to driver
19:48:42.672 INFO TaskSetManager - Finished task 0.0 in stage 169.0 (TID 225) in 8 ms on localhost (executor driver) (1/1)
19:48:42.672 INFO TaskSchedulerImpl - Removed TaskSet 169.0, whose tasks have all completed, from pool
19:48:42.672 INFO DAGScheduler - ResultStage 169 (collect at ReadsSparkSinkUnitTest.java:94) finished in 0.010 s
19:48:42.672 INFO DAGScheduler - Job 124 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.672 INFO TaskSchedulerImpl - Killing all running tasks in stage 169: Stage finished
19:48:42.672 INFO DAGScheduler - Job 124 finished: collect at ReadsSparkSinkUnitTest.java:94, took 0.010406 s
19:48:42.681 INFO MemoryStore - Block broadcast_329 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
19:48:42.687 INFO MemoryStore - Block broadcast_329_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
19:48:42.687 INFO BlockManagerInfo - Added broadcast_329_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:42.688 INFO SparkContext - Created broadcast 329 from newAPIHadoopFile at PathSplitSource.java:96
19:48:42.709 INFO MemoryStore - Block broadcast_330 stored as values in memory (estimated size 297.9 KiB, free 1917.6 MiB)
19:48:42.715 INFO MemoryStore - Block broadcast_330_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.6 MiB)
19:48:42.715 INFO BlockManagerInfo - Added broadcast_330_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:42.715 INFO SparkContext - Created broadcast 330 from newAPIHadoopFile at PathSplitSource.java:96
19:48:42.735 INFO FileInputFormat - Total input files to process : 1
19:48:42.736 INFO MemoryStore - Block broadcast_331 stored as values in memory (estimated size 160.7 KiB, free 1917.4 MiB)
19:48:42.741 INFO BlockManagerInfo - Removed broadcast_327_piece0 on localhost:36125 in memory (size: 3.9 KiB, free: 1919.6 MiB)
19:48:42.741 INFO MemoryStore - Block broadcast_331_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.4 MiB)
19:48:42.741 INFO BlockManagerInfo - Added broadcast_331_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.6 MiB)
19:48:42.742 INFO SparkContext - Created broadcast 331 from broadcast at ReadsSparkSink.java:133
19:48:42.742 INFO BlockManagerInfo - Removed broadcast_314_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.6 MiB)
19:48:42.742 INFO BlockManagerInfo - Removed broadcast_322_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:42.743 INFO BlockManagerInfo - Removed broadcast_313_piece0 on localhost:36125 in memory (size: 2.4 KiB, free: 1919.7 MiB)
19:48:42.743 INFO BlockManagerInfo - Removed broadcast_324_piece0 on localhost:36125 in memory (size: 55.2 KiB, free: 1919.7 MiB)
19:48:42.743 INFO MemoryStore - Block broadcast_332 stored as values in memory (estimated size 163.2 KiB, free 1918.2 MiB)
19:48:42.744 INFO BlockManagerInfo - Removed broadcast_323_piece0 on localhost:36125 in memory (size: 54.7 KiB, free: 1919.8 MiB)
19:48:42.744 INFO MemoryStore - Block broadcast_332_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.4 MiB)
19:48:42.744 INFO BlockManagerInfo - Added broadcast_332_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.8 MiB)
19:48:42.745 INFO BlockManagerInfo - Removed broadcast_318_piece0 on localhost:36125 in memory (size: 2.4 KiB, free: 1919.8 MiB)
19:48:42.745 INFO SparkContext - Created broadcast 332 from broadcast at BamSink.java:76
19:48:42.745 INFO BlockManagerInfo - Removed broadcast_321_piece0 on localhost:36125 in memory (size: 127.0 B, free: 1919.8 MiB)
19:48:42.746 INFO BlockManagerInfo - Removed broadcast_325_piece0 on localhost:36125 in memory (size: 2.4 KiB, free: 1919.8 MiB)
19:48:42.747 INFO BlockManagerInfo - Removed broadcast_326_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.8 MiB)
19:48:42.747 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:42.747 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:42.747 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:42.747 INFO BlockManagerInfo - Removed broadcast_330_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:42.748 INFO BlockManagerInfo - Removed broadcast_328_piece0 on localhost:36125 in memory (size: 4.4 KiB, free: 1919.9 MiB)
19:48:42.748 INFO BlockManagerInfo - Removed broadcast_320_piece0 on localhost:36125 in memory (size: 56.4 KiB, free: 1919.9 MiB)
19:48:42.749 INFO BlockManagerInfo - Removed broadcast_319_piece0 on localhost:36125 in memory (size: 2.4 KiB, free: 1919.9 MiB)
19:48:42.765 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:42.765 INFO DAGScheduler - Registering RDD 803 (mapToPair at SparkUtils.java:161) as input to shuffle 34
19:48:42.765 INFO DAGScheduler - Got job 125 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:42.765 INFO DAGScheduler - Final stage: ResultStage 171 (runJob at SparkHadoopWriter.scala:83)
19:48:42.765 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 170)
19:48:42.765 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 170)
19:48:42.765 INFO DAGScheduler - Submitting ShuffleMapStage 170 (MapPartitionsRDD[803] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:42.783 INFO MemoryStore - Block broadcast_333 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
19:48:42.784 INFO MemoryStore - Block broadcast_333_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
19:48:42.784 INFO BlockManagerInfo - Added broadcast_333_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.8 MiB)
19:48:42.784 INFO SparkContext - Created broadcast 333 from broadcast at DAGScheduler.scala:1580
19:48:42.784 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 170 (MapPartitionsRDD[803] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:42.784 INFO TaskSchedulerImpl - Adding task set 170.0 with 1 tasks resource profile 0
19:48:42.785 INFO TaskSetManager - Starting task 0.0 in stage 170.0 (TID 226) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:42.785 INFO Executor - Running task 0.0 in stage 170.0 (TID 226)
19:48:42.816 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:42.832 INFO Executor - Finished task 0.0 in stage 170.0 (TID 226). 1148 bytes result sent to driver
19:48:42.832 INFO TaskSetManager - Finished task 0.0 in stage 170.0 (TID 226) in 47 ms on localhost (executor driver) (1/1)
19:48:42.832 INFO TaskSchedulerImpl - Removed TaskSet 170.0, whose tasks have all completed, from pool
19:48:42.832 INFO DAGScheduler - ShuffleMapStage 170 (mapToPair at SparkUtils.java:161) finished in 0.066 s
19:48:42.832 INFO DAGScheduler - looking for newly runnable stages
19:48:42.832 INFO DAGScheduler - running: HashSet()
19:48:42.832 INFO DAGScheduler - waiting: HashSet(ResultStage 171)
19:48:42.832 INFO DAGScheduler - failed: HashSet()
19:48:42.832 INFO DAGScheduler - Submitting ResultStage 171 (MapPartitionsRDD[808] at mapToPair at BamSink.java:91), which has no missing parents
19:48:42.839 INFO MemoryStore - Block broadcast_334 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
19:48:42.840 INFO MemoryStore - Block broadcast_334_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
19:48:42.840 INFO BlockManagerInfo - Added broadcast_334_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.7 MiB)
19:48:42.840 INFO SparkContext - Created broadcast 334 from broadcast at DAGScheduler.scala:1580
19:48:42.840 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 171 (MapPartitionsRDD[808] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:42.840 INFO TaskSchedulerImpl - Adding task set 171.0 with 1 tasks resource profile 0
19:48:42.841 INFO TaskSetManager - Starting task 0.0 in stage 171.0 (TID 227) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:42.841 INFO Executor - Running task 0.0 in stage 171.0 (TID 227)
19:48:42.845 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:42.845 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:42.856 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:42.856 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:42.856 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:42.856 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:42.856 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:42.856 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:42.883 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948422562875425096855805_0808_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace17874004279595336545/_temporary/0/task_202507151948422562875425096855805_0808_r_000000
19:48:42.883 INFO SparkHadoopMapRedUtil - attempt_202507151948422562875425096855805_0808_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:42.884 INFO Executor - Finished task 0.0 in stage 171.0 (TID 227). 1858 bytes result sent to driver
19:48:42.884 INFO TaskSetManager - Finished task 0.0 in stage 171.0 (TID 227) in 44 ms on localhost (executor driver) (1/1)
19:48:42.884 INFO TaskSchedulerImpl - Removed TaskSet 171.0, whose tasks have all completed, from pool
19:48:42.884 INFO DAGScheduler - ResultStage 171 (runJob at SparkHadoopWriter.scala:83) finished in 0.051 s
19:48:42.884 INFO DAGScheduler - Job 125 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.884 INFO TaskSchedulerImpl - Killing all running tasks in stage 171: Stage finished
19:48:42.884 INFO DAGScheduler - Job 125 finished: runJob at SparkHadoopWriter.scala:83, took 0.119765 s
19:48:42.885 INFO SparkHadoopWriter - Start to commit write Job job_202507151948422562875425096855805_0808.
19:48:42.891 INFO SparkHadoopWriter - Write Job job_202507151948422562875425096855805_0808 committed. Elapsed time: 5 ms.
19:48:42.903 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest116276004243622841112.bam
19:48:42.908 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest116276004243622841112.bam done
19:48:42.908 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace17874004279595336545 to /tmp/ReadsSparkSinkUnitTest116276004243622841112.bam.sbi
19:48:42.912 INFO IndexFileMerger - Done merging .sbi files
19:48:42.912 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace17874004279595336545 to /tmp/ReadsSparkSinkUnitTest116276004243622841112.bam.bai
19:48:42.917 INFO IndexFileMerger - Done merging .bai files
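The write path recorded above always has the same two-stage shape: a ShuffleMapStage produced by the mapToPair at SparkUtils.java:161, then a ResultStage that writes the shuffled partitions through SparkHadoopWriter and FileOutputCommitter, after which the part files are concatenated and the .sbi/.bai indexes merged. The sketch below is illustrative only (it is not the GATK BamSink/SparkUtils code; the class name, data, and output path are made up) and merely reproduces that ShuffleMapStage -> ResultStage DAG shape with stock Spark Java APIs.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;
    import java.util.Arrays;

    // Illustrative sketch only: shows why the scheduler log above splits each write
    // into a ShuffleMapStage followed by a ResultStage. Not taken from GATK.
    public class TwoStageWriteSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("two-stage-sketch");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaRDD<String> reads = sc.parallelize(Arrays.asList("r1", "r2", "r3"));
                // mapToPair keys the records; sortByKey forces a shuffle, so the
                // DAGScheduler registers a ShuffleMapStage for this RDD.
                JavaPairRDD<String, Integer> keyed =
                        reads.mapToPair(r -> new Tuple2<>(r, 1)).sortByKey();
                // The save action runs as a ResultStage that consumes the shuffle
                // output and commits one part file per partition (hypothetical path).
                keyed.saveAsTextFile("/tmp/two-stage-sketch-output");
            }
        }
    }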
19:48:42.919 INFO MemoryStore - Block broadcast_335 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
19:48:42.920 INFO MemoryStore - Block broadcast_335_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
19:48:42.920 INFO BlockManagerInfo - Added broadcast_335_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.7 MiB)
19:48:42.920 INFO SparkContext - Created broadcast 335 from broadcast at BamSource.java:104
19:48:42.921 INFO MemoryStore - Block broadcast_336 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:42.927 INFO MemoryStore - Block broadcast_336_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:42.927 INFO BlockManagerInfo - Added broadcast_336_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:42.927 INFO SparkContext - Created broadcast 336 from newAPIHadoopFile at PathSplitSource.java:96
19:48:42.936 INFO FileInputFormat - Total input files to process : 1
19:48:42.950 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:42.950 INFO DAGScheduler - Got job 126 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:42.950 INFO DAGScheduler - Final stage: ResultStage 172 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:42.950 INFO DAGScheduler - Parents of final stage: List()
19:48:42.951 INFO DAGScheduler - Missing parents: List()
19:48:42.951 INFO DAGScheduler - Submitting ResultStage 172 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:42.957 INFO MemoryStore - Block broadcast_337 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
19:48:42.957 INFO MemoryStore - Block broadcast_337_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
19:48:42.957 INFO BlockManagerInfo - Added broadcast_337_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:42.958 INFO SparkContext - Created broadcast 337 from broadcast at DAGScheduler.scala:1580
19:48:42.958 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 172 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:42.958 INFO TaskSchedulerImpl - Adding task set 172.0 with 1 tasks resource profile 0
19:48:42.958 INFO TaskSetManager - Starting task 0.0 in stage 172.0 (TID 228) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:42.959 INFO Executor - Running task 0.0 in stage 172.0 (TID 228)
19:48:42.970 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest116276004243622841112.bam:0+237038
19:48:42.975 INFO Executor - Finished task 0.0 in stage 172.0 (TID 228). 651526 bytes result sent to driver
19:48:42.976 INFO TaskSetManager - Finished task 0.0 in stage 172.0 (TID 228) in 18 ms on localhost (executor driver) (1/1)
19:48:42.976 INFO TaskSchedulerImpl - Removed TaskSet 172.0, whose tasks have all completed, from pool
19:48:42.976 INFO DAGScheduler - ResultStage 172 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
19:48:42.976 INFO DAGScheduler - Job 126 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:42.976 INFO TaskSchedulerImpl - Killing all running tasks in stage 172: Stage finished
19:48:42.976 INFO DAGScheduler - Job 126 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026015 s
19:48:42.985 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:42.986 INFO DAGScheduler - Got job 127 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:42.986 INFO DAGScheduler - Final stage: ResultStage 173 (count at ReadsSparkSinkUnitTest.java:185)
19:48:42.986 INFO DAGScheduler - Parents of final stage: List()
19:48:42.986 INFO DAGScheduler - Missing parents: List()
19:48:42.986 INFO DAGScheduler - Submitting ResultStage 173 (MapPartitionsRDD[796] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:43.011 INFO MemoryStore - Block broadcast_338 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:48:43.012 INFO MemoryStore - Block broadcast_338_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
19:48:43.012 INFO BlockManagerInfo - Added broadcast_338_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:43.012 INFO SparkContext - Created broadcast 338 from broadcast at DAGScheduler.scala:1580
19:48:43.012 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 173 (MapPartitionsRDD[796] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:43.012 INFO TaskSchedulerImpl - Adding task set 173.0 with 1 tasks resource profile 0
19:48:43.013 INFO TaskSetManager - Starting task 0.0 in stage 173.0 (TID 229) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:43.013 INFO Executor - Running task 0.0 in stage 173.0 (TID 229)
19:48:43.042 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:43.052 INFO Executor - Finished task 0.0 in stage 173.0 (TID 229). 989 bytes result sent to driver
19:48:43.052 INFO TaskSetManager - Finished task 0.0 in stage 173.0 (TID 229) in 39 ms on localhost (executor driver) (1/1)
19:48:43.052 INFO TaskSchedulerImpl - Removed TaskSet 173.0, whose tasks have all completed, from pool
19:48:43.052 INFO DAGScheduler - ResultStage 173 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.066 s
19:48:43.052 INFO DAGScheduler - Job 127 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.052 INFO TaskSchedulerImpl - Killing all running tasks in stage 173: Stage finished
19:48:43.052 INFO DAGScheduler - Job 127 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066950 s
19:48:43.057 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:43.057 INFO DAGScheduler - Got job 128 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:43.057 INFO DAGScheduler - Final stage: ResultStage 174 (count at ReadsSparkSinkUnitTest.java:185)
19:48:43.057 INFO DAGScheduler - Parents of final stage: List()
19:48:43.057 INFO DAGScheduler - Missing parents: List()
19:48:43.057 INFO DAGScheduler - Submitting ResultStage 174 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:43.063 INFO MemoryStore - Block broadcast_339 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
19:48:43.064 INFO MemoryStore - Block broadcast_339_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
19:48:43.064 INFO BlockManagerInfo - Added broadcast_339_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.4 MiB)
19:48:43.064 INFO SparkContext - Created broadcast 339 from broadcast at DAGScheduler.scala:1580
19:48:43.064 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 174 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:43.064 INFO TaskSchedulerImpl - Adding task set 174.0 with 1 tasks resource profile 0
19:48:43.065 INFO TaskSetManager - Starting task 0.0 in stage 174.0 (TID 230) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:43.065 INFO Executor - Running task 0.0 in stage 174.0 (TID 230)
19:48:43.076 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest116276004243622841112.bam:0+237038
19:48:43.079 INFO Executor - Finished task 0.0 in stage 174.0 (TID 230). 989 bytes result sent to driver
19:48:43.079 INFO TaskSetManager - Finished task 0.0 in stage 174.0 (TID 230) in 14 ms on localhost (executor driver) (1/1)
19:48:43.079 INFO TaskSchedulerImpl - Removed TaskSet 174.0, whose tasks have all completed, from pool
19:48:43.079 INFO DAGScheduler - ResultStage 174 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.021 s
19:48:43.079 INFO DAGScheduler - Job 128 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.079 INFO TaskSchedulerImpl - Killing all running tasks in stage 174: Stage finished
19:48:43.079 INFO DAGScheduler - Job 128 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022560 s
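Each write is then verified the same way: the merged BAM is read back, the collect at ReadsSparkSinkUnitTest.java:182 materialises the reads on the driver, and the two count jobs at line 185 compare the round-tripped data against the original input. A minimal stand-alone sketch of that round-trip check follows; it is illustrative only (not the actual ReadsSparkSinkUnitTest code), with hypothetical paths and plain text files standing in for BAM input.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import java.util.List;

    // Illustrative round-trip check only; not the GATK test itself.
    public class RoundTripCheckSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("round-trip-check");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaRDD<String> original = sc.textFile("/tmp/original-input.txt");     // hypothetical path
                JavaRDD<String> roundTripped = sc.textFile("/tmp/written-output.txt"); // hypothetical path
                // collect() drives one Spark job per call (cf. "collect at ...:182" above)
                List<String> records = roundTripped.collect();
                // count() on each RDD drives the two "count at ...:185" jobs seen above
                long before = original.count();
                long after = roundTripped.count();
                if (before != after || records.size() != after) {
                    throw new AssertionError("round-trip lost records: " + before + " vs " + after);
                }
            }
        }
    }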
19:48:43.087 INFO MemoryStore - Block broadcast_340 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
19:48:43.093 INFO MemoryStore - Block broadcast_340_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
19:48:43.093 INFO BlockManagerInfo - Added broadcast_340_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:43.093 INFO SparkContext - Created broadcast 340 from newAPIHadoopFile at PathSplitSource.java:96
19:48:43.114 INFO MemoryStore - Block broadcast_341 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
19:48:43.120 INFO MemoryStore - Block broadcast_341_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
19:48:43.120 INFO BlockManagerInfo - Added broadcast_341_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:43.121 INFO SparkContext - Created broadcast 341 from newAPIHadoopFile at PathSplitSource.java:96
19:48:43.140 INFO FileInputFormat - Total input files to process : 1
19:48:43.142 INFO MemoryStore - Block broadcast_342 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
19:48:43.142 INFO MemoryStore - Block broadcast_342_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
19:48:43.142 INFO BlockManagerInfo - Added broadcast_342_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:43.143 INFO SparkContext - Created broadcast 342 from broadcast at ReadsSparkSink.java:133
19:48:43.144 INFO MemoryStore - Block broadcast_343 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
19:48:43.148 INFO BlockManagerInfo - Removed broadcast_341_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:43.148 INFO MemoryStore - Block broadcast_343_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.4 MiB)
19:48:43.149 INFO BlockManagerInfo - Added broadcast_343_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:43.149 INFO BlockManagerInfo - Removed broadcast_337_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.4 MiB)
19:48:43.149 INFO SparkContext - Created broadcast 343 from broadcast at BamSink.java:76
19:48:43.149 INFO BlockManagerInfo - Removed broadcast_338_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:43.150 INFO BlockManagerInfo - Removed broadcast_329_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:43.150 INFO BlockManagerInfo - Removed broadcast_332_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:43.151 INFO BlockManagerInfo - Removed broadcast_334_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.7 MiB)
19:48:43.151 INFO BlockManagerInfo - Removed broadcast_331_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.7 MiB)
19:48:43.151 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:43.151 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:43.151 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:43.151 INFO BlockManagerInfo - Removed broadcast_335_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.7 MiB)
19:48:43.152 INFO BlockManagerInfo - Removed broadcast_339_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.7 MiB)
19:48:43.152 INFO BlockManagerInfo - Removed broadcast_336_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.8 MiB)
19:48:43.153 INFO BlockManagerInfo - Removed broadcast_333_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.9 MiB)
19:48:43.168 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:43.169 INFO DAGScheduler - Registering RDD 828 (mapToPair at SparkUtils.java:161) as input to shuffle 35
19:48:43.169 INFO DAGScheduler - Got job 129 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:43.169 INFO DAGScheduler - Final stage: ResultStage 176 (runJob at SparkHadoopWriter.scala:83)
19:48:43.169 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 175)
19:48:43.169 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 175)
19:48:43.169 INFO DAGScheduler - Submitting ShuffleMapStage 175 (MapPartitionsRDD[828] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:43.186 INFO MemoryStore - Block broadcast_344 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
19:48:43.187 INFO MemoryStore - Block broadcast_344_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
19:48:43.187 INFO BlockManagerInfo - Added broadcast_344_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.8 MiB)
19:48:43.188 INFO SparkContext - Created broadcast 344 from broadcast at DAGScheduler.scala:1580
19:48:43.188 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 175 (MapPartitionsRDD[828] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:43.188 INFO TaskSchedulerImpl - Adding task set 175.0 with 1 tasks resource profile 0
19:48:43.188 INFO TaskSetManager - Starting task 0.0 in stage 175.0 (TID 231) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:43.189 INFO Executor - Running task 0.0 in stage 175.0 (TID 231)
19:48:43.218 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:43.235 INFO Executor - Finished task 0.0 in stage 175.0 (TID 231). 1148 bytes result sent to driver
19:48:43.235 INFO TaskSetManager - Finished task 0.0 in stage 175.0 (TID 231) in 47 ms on localhost (executor driver) (1/1)
19:48:43.235 INFO TaskSchedulerImpl - Removed TaskSet 175.0, whose tasks have all completed, from pool
19:48:43.236 INFO DAGScheduler - ShuffleMapStage 175 (mapToPair at SparkUtils.java:161) finished in 0.067 s
19:48:43.236 INFO DAGScheduler - looking for newly runnable stages
19:48:43.236 INFO DAGScheduler - running: HashSet()
19:48:43.236 INFO DAGScheduler - waiting: HashSet(ResultStage 176)
19:48:43.236 INFO DAGScheduler - failed: HashSet()
19:48:43.236 INFO DAGScheduler - Submitting ResultStage 176 (MapPartitionsRDD[833] at mapToPair at BamSink.java:91), which has no missing parents
19:48:43.242 INFO MemoryStore - Block broadcast_345 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
19:48:43.243 INFO MemoryStore - Block broadcast_345_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
19:48:43.243 INFO BlockManagerInfo - Added broadcast_345_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.7 MiB)
19:48:43.243 INFO SparkContext - Created broadcast 345 from broadcast at DAGScheduler.scala:1580
19:48:43.244 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 176 (MapPartitionsRDD[833] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:43.244 INFO TaskSchedulerImpl - Adding task set 176.0 with 1 tasks resource profile 0
19:48:43.244 INFO TaskSetManager - Starting task 0.0 in stage 176.0 (TID 232) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:43.244 INFO Executor - Running task 0.0 in stage 176.0 (TID 232)
19:48:43.248 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:43.248 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:43.259 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:43.259 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:43.259 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:43.260 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:43.260 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:43.260 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:43.283 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948434349132018975119552_0833_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace943944438173980000/_temporary/0/task_202507151948434349132018975119552_0833_r_000000
19:48:43.283 INFO SparkHadoopMapRedUtil - attempt_202507151948434349132018975119552_0833_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:43.284 INFO Executor - Finished task 0.0 in stage 176.0 (TID 232). 1858 bytes result sent to driver
19:48:43.284 INFO TaskSetManager - Finished task 0.0 in stage 176.0 (TID 232) in 40 ms on localhost (executor driver) (1/1)
19:48:43.284 INFO TaskSchedulerImpl - Removed TaskSet 176.0, whose tasks have all completed, from pool
19:48:43.285 INFO DAGScheduler - ResultStage 176 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
19:48:43.285 INFO DAGScheduler - Job 129 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.285 INFO TaskSchedulerImpl - Killing all running tasks in stage 176: Stage finished
19:48:43.285 INFO DAGScheduler - Job 129 finished: runJob at SparkHadoopWriter.scala:83, took 0.116982 s
19:48:43.285 INFO SparkHadoopWriter - Start to commit write Job job_202507151948434349132018975119552_0833.
19:48:43.290 INFO SparkHadoopWriter - Write Job job_202507151948434349132018975119552_0833 committed. Elapsed time: 4 ms.
19:48:43.302 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest110160836768213692359.bam
19:48:43.306 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest110160836768213692359.bam done
19:48:43.306 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace943944438173980000 to /tmp/ReadsSparkSinkUnitTest110160836768213692359.bam.sbi
19:48:43.311 INFO IndexFileMerger - Done merging .sbi files
19:48:43.311 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace943944438173980000 to /tmp/ReadsSparkSinkUnitTest110160836768213692359.bam.bai
19:48:43.315 INFO IndexFileMerger - Done merging .bai files
19:48:43.317 INFO MemoryStore - Block broadcast_346 stored as values in memory (estimated size 13.3 KiB, free 1918.3 MiB)
19:48:43.318 INFO MemoryStore - Block broadcast_346_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.3 MiB)
19:48:43.318 INFO BlockManagerInfo - Added broadcast_346_piece0 in memory on localhost:36125 (size: 8.3 KiB, free: 1919.7 MiB)
19:48:43.318 INFO SparkContext - Created broadcast 346 from broadcast at BamSource.java:104
19:48:43.319 INFO MemoryStore - Block broadcast_347 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
19:48:43.325 INFO MemoryStore - Block broadcast_347_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:43.325 INFO BlockManagerInfo - Added broadcast_347_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:43.325 INFO SparkContext - Created broadcast 347 from newAPIHadoopFile at PathSplitSource.java:96
19:48:43.334 INFO FileInputFormat - Total input files to process : 1
19:48:43.348 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:43.348 INFO DAGScheduler - Got job 130 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:43.348 INFO DAGScheduler - Final stage: ResultStage 177 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:43.348 INFO DAGScheduler - Parents of final stage: List()
19:48:43.348 INFO DAGScheduler - Missing parents: List()
19:48:43.348 INFO DAGScheduler - Submitting ResultStage 177 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:43.356 INFO MemoryStore - Block broadcast_348 stored as values in memory (estimated size 148.2 KiB, free 1917.8 MiB)
19:48:43.357 INFO MemoryStore - Block broadcast_348_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.8 MiB)
19:48:43.357 INFO BlockManagerInfo - Added broadcast_348_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.6 MiB)
19:48:43.357 INFO SparkContext - Created broadcast 348 from broadcast at DAGScheduler.scala:1580
19:48:43.357 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 177 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:43.357 INFO TaskSchedulerImpl - Adding task set 177.0 with 1 tasks resource profile 0
19:48:43.357 INFO TaskSetManager - Starting task 0.0 in stage 177.0 (TID 233) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:43.358 INFO Executor - Running task 0.0 in stage 177.0 (TID 233)
19:48:43.372 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest110160836768213692359.bam:0+237038
19:48:43.376 INFO Executor - Finished task 0.0 in stage 177.0 (TID 233). 651483 bytes result sent to driver
19:48:43.378 INFO TaskSetManager - Finished task 0.0 in stage 177.0 (TID 233) in 21 ms on localhost (executor driver) (1/1)
19:48:43.378 INFO TaskSchedulerImpl - Removed TaskSet 177.0, whose tasks have all completed, from pool
19:48:43.378 INFO DAGScheduler - ResultStage 177 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
19:48:43.378 INFO DAGScheduler - Job 130 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.378 INFO TaskSchedulerImpl - Killing all running tasks in stage 177: Stage finished
19:48:43.378 INFO DAGScheduler - Job 130 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.030155 s
19:48:43.390 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:43.391 INFO DAGScheduler - Got job 131 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:43.391 INFO DAGScheduler - Final stage: ResultStage 178 (count at ReadsSparkSinkUnitTest.java:185)
19:48:43.391 INFO DAGScheduler - Parents of final stage: List()
19:48:43.391 INFO DAGScheduler - Missing parents: List()
19:48:43.391 INFO DAGScheduler - Submitting ResultStage 178 (MapPartitionsRDD[821] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:43.407 INFO MemoryStore - Block broadcast_349 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:48:43.409 INFO MemoryStore - Block broadcast_349_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
19:48:43.409 INFO BlockManagerInfo - Added broadcast_349_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.4 MiB)
19:48:43.409 INFO SparkContext - Created broadcast 349 from broadcast at DAGScheduler.scala:1580
19:48:43.409 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 178 (MapPartitionsRDD[821] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:43.409 INFO TaskSchedulerImpl - Adding task set 178.0 with 1 tasks resource profile 0
19:48:43.410 INFO TaskSetManager - Starting task 0.0 in stage 178.0 (TID 234) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:43.410 INFO Executor - Running task 0.0 in stage 178.0 (TID 234)
19:48:43.439 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:43.448 INFO Executor - Finished task 0.0 in stage 178.0 (TID 234). 989 bytes result sent to driver
19:48:43.448 INFO TaskSetManager - Finished task 0.0 in stage 178.0 (TID 234) in 39 ms on localhost (executor driver) (1/1)
19:48:43.448 INFO TaskSchedulerImpl - Removed TaskSet 178.0, whose tasks have all completed, from pool
19:48:43.449 INFO DAGScheduler - ResultStage 178 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
19:48:43.449 INFO DAGScheduler - Job 131 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.449 INFO TaskSchedulerImpl - Killing all running tasks in stage 178: Stage finished
19:48:43.449 INFO DAGScheduler - Job 131 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058427 s
19:48:43.452 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:43.452 INFO DAGScheduler - Got job 132 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:43.452 INFO DAGScheduler - Final stage: ResultStage 179 (count at ReadsSparkSinkUnitTest.java:185)
19:48:43.452 INFO DAGScheduler - Parents of final stage: List()
19:48:43.452 INFO DAGScheduler - Missing parents: List()
19:48:43.452 INFO DAGScheduler - Submitting ResultStage 179 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:43.458 INFO MemoryStore - Block broadcast_350 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
19:48:43.459 INFO MemoryStore - Block broadcast_350_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.0 MiB)
19:48:43.459 INFO BlockManagerInfo - Added broadcast_350_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.4 MiB)
19:48:43.459 INFO SparkContext - Created broadcast 350 from broadcast at DAGScheduler.scala:1580
19:48:43.460 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 179 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:43.460 INFO TaskSchedulerImpl - Adding task set 179.0 with 1 tasks resource profile 0
19:48:43.460 INFO TaskSetManager - Starting task 0.0 in stage 179.0 (TID 235) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:43.460 INFO Executor - Running task 0.0 in stage 179.0 (TID 235)
19:48:43.471 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest110160836768213692359.bam:0+237038
19:48:43.474 INFO Executor - Finished task 0.0 in stage 179.0 (TID 235). 989 bytes result sent to driver
19:48:43.474 INFO TaskSetManager - Finished task 0.0 in stage 179.0 (TID 235) in 14 ms on localhost (executor driver) (1/1)
19:48:43.474 INFO TaskSchedulerImpl - Removed TaskSet 179.0, whose tasks have all completed, from pool
19:48:43.474 INFO DAGScheduler - ResultStage 179 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.021 s
19:48:43.474 INFO DAGScheduler - Job 132 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.474 INFO TaskSchedulerImpl - Killing all running tasks in stage 179: Stage finished
19:48:43.474 INFO DAGScheduler - Job 132 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022205 s
19:48:43.482 INFO MemoryStore - Block broadcast_351 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
19:48:43.488 INFO MemoryStore - Block broadcast_351_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
19:48:43.488 INFO BlockManagerInfo - Added broadcast_351_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:43.488 INFO SparkContext - Created broadcast 351 from newAPIHadoopFile at PathSplitSource.java:96
19:48:43.509 INFO MemoryStore - Block broadcast_352 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
19:48:43.515 INFO MemoryStore - Block broadcast_352_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
19:48:43.515 INFO BlockManagerInfo - Added broadcast_352_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:43.515 INFO SparkContext - Created broadcast 352 from newAPIHadoopFile at PathSplitSource.java:96
19:48:43.535 INFO FileInputFormat - Total input files to process : 1
19:48:43.536 INFO MemoryStore - Block broadcast_353 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
19:48:43.537 INFO MemoryStore - Block broadcast_353_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
19:48:43.537 INFO BlockManagerInfo - Added broadcast_353_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:43.537 INFO SparkContext - Created broadcast 353 from broadcast at ReadsSparkSink.java:133
19:48:43.538 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:43.538 INFO MemoryStore - Block broadcast_354 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
19:48:43.539 INFO MemoryStore - Block broadcast_354_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
19:48:43.539 INFO BlockManagerInfo - Added broadcast_354_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:43.539 INFO SparkContext - Created broadcast 354 from broadcast at BamSink.java:76
19:48:43.541 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:43.541 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:43.541 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:43.558 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:43.558 INFO DAGScheduler - Registering RDD 853 (mapToPair at SparkUtils.java:161) as input to shuffle 36
19:48:43.558 INFO DAGScheduler - Got job 133 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:43.558 INFO DAGScheduler - Final stage: ResultStage 181 (runJob at SparkHadoopWriter.scala:83)
19:48:43.558 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 180)
19:48:43.558 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 180)
19:48:43.558 INFO DAGScheduler - Submitting ShuffleMapStage 180 (MapPartitionsRDD[853] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:43.575 INFO MemoryStore - Block broadcast_355 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
19:48:43.581 INFO BlockManagerInfo - Removed broadcast_346_piece0 on localhost:36125 in memory (size: 8.3 KiB, free: 1919.3 MiB)
19:48:43.582 INFO BlockManagerInfo - Removed broadcast_348_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.3 MiB)
19:48:43.582 INFO MemoryStore - Block broadcast_355_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.6 MiB)
19:48:43.582 INFO BlockManagerInfo - Added broadcast_355_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:48:43.582 INFO SparkContext - Created broadcast 355 from broadcast at DAGScheduler.scala:1580
19:48:43.582 INFO BlockManagerInfo - Removed broadcast_340_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:43.583 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 180 (MapPartitionsRDD[853] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:43.583 INFO TaskSchedulerImpl - Adding task set 180.0 with 1 tasks resource profile 0
19:48:43.583 INFO BlockManagerInfo - Removed broadcast_352_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:43.583 INFO TaskSetManager - Starting task 0.0 in stage 180.0 (TID 236) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:43.584 INFO Executor - Running task 0.0 in stage 180.0 (TID 236)
19:48:43.584 INFO BlockManagerInfo - Removed broadcast_349_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.4 MiB)
19:48:43.585 INFO BlockManagerInfo - Removed broadcast_350_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.5 MiB)
19:48:43.585 INFO BlockManagerInfo - Removed broadcast_345_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.5 MiB)
19:48:43.586 INFO BlockManagerInfo - Removed broadcast_347_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:43.586 INFO BlockManagerInfo - Removed broadcast_343_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.6 MiB)
19:48:43.587 INFO BlockManagerInfo - Removed broadcast_344_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.8 MiB)
19:48:43.587 INFO BlockManagerInfo - Removed broadcast_342_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.8 MiB)
19:48:43.615 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:43.633 INFO Executor - Finished task 0.0 in stage 180.0 (TID 236). 1148 bytes result sent to driver
19:48:43.634 INFO TaskSetManager - Finished task 0.0 in stage 180.0 (TID 236) in 51 ms on localhost (executor driver) (1/1)
19:48:43.634 INFO TaskSchedulerImpl - Removed TaskSet 180.0, whose tasks have all completed, from pool
19:48:43.634 INFO DAGScheduler - ShuffleMapStage 180 (mapToPair at SparkUtils.java:161) finished in 0.075 s
19:48:43.634 INFO DAGScheduler - looking for newly runnable stages
19:48:43.634 INFO DAGScheduler - running: HashSet()
19:48:43.634 INFO DAGScheduler - waiting: HashSet(ResultStage 181)
19:48:43.634 INFO DAGScheduler - failed: HashSet()
19:48:43.634 INFO DAGScheduler - Submitting ResultStage 181 (MapPartitionsRDD[858] at mapToPair at BamSink.java:91), which has no missing parents
19:48:43.643 INFO MemoryStore - Block broadcast_356 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
19:48:43.644 INFO MemoryStore - Block broadcast_356_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
19:48:43.644 INFO BlockManagerInfo - Added broadcast_356_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.7 MiB)
19:48:43.644 INFO SparkContext - Created broadcast 356 from broadcast at DAGScheduler.scala:1580
19:48:43.644 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 181 (MapPartitionsRDD[858] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:43.644 INFO TaskSchedulerImpl - Adding task set 181.0 with 1 tasks resource profile 0
19:48:43.645 INFO TaskSetManager - Starting task 0.0 in stage 181.0 (TID 237) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:43.645 INFO Executor - Running task 0.0 in stage 181.0 (TID 237)
19:48:43.651 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:43.652 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:43.667 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:43.667 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:43.667 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:43.667 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:43.667 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:43.667 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:43.686 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948434829392600277118488_0858_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace8176719030173446975/_temporary/0/task_202507151948434829392600277118488_0858_r_000000
19:48:43.686 INFO SparkHadoopMapRedUtil - attempt_202507151948434829392600277118488_0858_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:43.687 INFO Executor - Finished task 0.0 in stage 181.0 (TID 237). 1858 bytes result sent to driver
19:48:43.687 INFO TaskSetManager - Finished task 0.0 in stage 181.0 (TID 237) in 42 ms on localhost (executor driver) (1/1)
19:48:43.687 INFO TaskSchedulerImpl - Removed TaskSet 181.0, whose tasks have all completed, from pool
19:48:43.687 INFO DAGScheduler - ResultStage 181 (runJob at SparkHadoopWriter.scala:83) finished in 0.053 s
19:48:43.687 INFO DAGScheduler - Job 133 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.687 INFO TaskSchedulerImpl - Killing all running tasks in stage 181: Stage finished
19:48:43.687 INFO DAGScheduler - Job 133 finished: runJob at SparkHadoopWriter.scala:83, took 0.129706 s
19:48:43.688 INFO SparkHadoopWriter - Start to commit write Job job_202507151948434829392600277118488_0858.
19:48:43.692 INFO SparkHadoopWriter - Write Job job_202507151948434829392600277118488_0858 committed. Elapsed time: 4 ms.
19:48:43.703 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest16038410326423865019.bam
19:48:43.707 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest16038410326423865019.bam done
19:48:43.707 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace8176719030173446975 to /tmp/ReadsSparkSinkUnitTest16038410326423865019.bam.bai
19:48:43.712 INFO IndexFileMerger - Done merging .bai files
19:48:43.715 INFO MemoryStore - Block broadcast_357 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:43.721 INFO MemoryStore - Block broadcast_357_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:43.721 INFO BlockManagerInfo - Added broadcast_357_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:43.721 INFO SparkContext - Created broadcast 357 from newAPIHadoopFile at PathSplitSource.java:96
19:48:43.741 INFO FileInputFormat - Total input files to process : 1
19:48:43.776 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:43.776 INFO DAGScheduler - Got job 134 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:43.776 INFO DAGScheduler - Final stage: ResultStage 182 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:43.776 INFO DAGScheduler - Parents of final stage: List()
19:48:43.776 INFO DAGScheduler - Missing parents: List()
19:48:43.777 INFO DAGScheduler - Submitting ResultStage 182 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:43.793 INFO MemoryStore - Block broadcast_358 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
19:48:43.794 INFO MemoryStore - Block broadcast_358_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
19:48:43.794 INFO BlockManagerInfo - Added broadcast_358_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:43.795 INFO SparkContext - Created broadcast 358 from broadcast at DAGScheduler.scala:1580
19:48:43.795 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 182 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:43.795 INFO TaskSchedulerImpl - Adding task set 182.0 with 1 tasks resource profile 0
19:48:43.795 INFO TaskSetManager - Starting task 0.0 in stage 182.0 (TID 238) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:43.796 INFO Executor - Running task 0.0 in stage 182.0 (TID 238)
19:48:43.824 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest16038410326423865019.bam:0+237038
19:48:43.836 INFO Executor - Finished task 0.0 in stage 182.0 (TID 238). 651483 bytes result sent to driver
19:48:43.838 INFO TaskSetManager - Finished task 0.0 in stage 182.0 (TID 238) in 43 ms on localhost (executor driver) (1/1)
19:48:43.838 INFO TaskSchedulerImpl - Removed TaskSet 182.0, whose tasks have all completed, from pool
19:48:43.838 INFO DAGScheduler - ResultStage 182 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.061 s
19:48:43.838 INFO DAGScheduler - Job 134 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.838 INFO TaskSchedulerImpl - Killing all running tasks in stage 182: Stage finished
19:48:43.838 INFO DAGScheduler - Job 134 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.062009 s
19:48:43.847 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:43.847 INFO DAGScheduler - Got job 135 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:43.847 INFO DAGScheduler - Final stage: ResultStage 183 (count at ReadsSparkSinkUnitTest.java:185)
19:48:43.847 INFO DAGScheduler - Parents of final stage: List()
19:48:43.847 INFO DAGScheduler - Missing parents: List()
19:48:43.848 INFO DAGScheduler - Submitting ResultStage 183 (MapPartitionsRDD[846] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:43.868 INFO MemoryStore - Block broadcast_359 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
19:48:43.869 INFO MemoryStore - Block broadcast_359_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
19:48:43.869 INFO BlockManagerInfo - Added broadcast_359_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.4 MiB)
19:48:43.869 INFO SparkContext - Created broadcast 359 from broadcast at DAGScheduler.scala:1580
19:48:43.870 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 183 (MapPartitionsRDD[846] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:43.870 INFO TaskSchedulerImpl - Adding task set 183.0 with 1 tasks resource profile 0
19:48:43.870 INFO TaskSetManager - Starting task 0.0 in stage 183.0 (TID 239) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:43.870 INFO Executor - Running task 0.0 in stage 183.0 (TID 239)
19:48:43.899 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:43.909 INFO Executor - Finished task 0.0 in stage 183.0 (TID 239). 989 bytes result sent to driver
19:48:43.909 INFO TaskSetManager - Finished task 0.0 in stage 183.0 (TID 239) in 39 ms on localhost (executor driver) (1/1)
19:48:43.909 INFO TaskSchedulerImpl - Removed TaskSet 183.0, whose tasks have all completed, from pool
19:48:43.909 INFO DAGScheduler - ResultStage 183 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.061 s
19:48:43.909 INFO DAGScheduler - Job 135 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.909 INFO TaskSchedulerImpl - Killing all running tasks in stage 183: Stage finished
19:48:43.909 INFO DAGScheduler - Job 135 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.062108 s
19:48:43.913 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:43.913 INFO DAGScheduler - Got job 136 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:43.913 INFO DAGScheduler - Final stage: ResultStage 184 (count at ReadsSparkSinkUnitTest.java:185)
19:48:43.913 INFO DAGScheduler - Parents of final stage: List()
19:48:43.913 INFO DAGScheduler - Missing parents: List()
19:48:43.913 INFO DAGScheduler - Submitting ResultStage 184 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:43.929 INFO MemoryStore - Block broadcast_360 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
19:48:43.931 INFO MemoryStore - Block broadcast_360_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
19:48:43.931 INFO BlockManagerInfo - Added broadcast_360_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.2 MiB)
19:48:43.931 INFO SparkContext - Created broadcast 360 from broadcast at DAGScheduler.scala:1580
19:48:43.931 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 184 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:43.931 INFO TaskSchedulerImpl - Adding task set 184.0 with 1 tasks resource profile 0
19:48:43.931 INFO TaskSetManager - Starting task 0.0 in stage 184.0 (TID 240) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:43.932 INFO Executor - Running task 0.0 in stage 184.0 (TID 240)
19:48:43.959 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest16038410326423865019.bam:0+237038
19:48:43.971 INFO Executor - Finished task 0.0 in stage 184.0 (TID 240). 989 bytes result sent to driver
19:48:43.971 INFO TaskSetManager - Finished task 0.0 in stage 184.0 (TID 240) in 40 ms on localhost (executor driver) (1/1)
19:48:43.971 INFO TaskSchedulerImpl - Removed TaskSet 184.0, whose tasks have all completed, from pool
19:48:43.971 INFO DAGScheduler - ResultStage 184 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
19:48:43.971 INFO DAGScheduler - Job 136 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:43.971 INFO TaskSchedulerImpl - Killing all running tasks in stage 184: Stage finished
19:48:43.971 INFO DAGScheduler - Job 136 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058800 s
19:48:43.980 INFO MemoryStore - Block broadcast_361 stored as values in memory (estimated size 297.9 KiB, free 1916.0 MiB)
19:48:43.986 INFO MemoryStore - Block broadcast_361_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
19:48:43.986 INFO BlockManagerInfo - Added broadcast_361_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.2 MiB)
19:48:43.986 INFO SparkContext - Created broadcast 361 from newAPIHadoopFile at PathSplitSource.java:96
19:48:44.012 INFO MemoryStore - Block broadcast_362 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
19:48:44.018 INFO MemoryStore - Block broadcast_362_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.6 MiB)
19:48:44.018 INFO BlockManagerInfo - Added broadcast_362_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.1 MiB)
19:48:44.018 INFO SparkContext - Created broadcast 362 from newAPIHadoopFile at PathSplitSource.java:96
19:48:44.038 INFO FileInputFormat - Total input files to process : 1
19:48:44.039 INFO MemoryStore - Block broadcast_363 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
19:48:44.040 INFO MemoryStore - Block broadcast_363_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
19:48:44.040 INFO BlockManagerInfo - Added broadcast_363_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:44.040 INFO SparkContext - Created broadcast 363 from broadcast at ReadsSparkSink.java:133
19:48:44.041 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:44.042 INFO MemoryStore - Block broadcast_364 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
19:48:44.043 INFO MemoryStore - Block broadcast_364_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
19:48:44.043 INFO BlockManagerInfo - Added broadcast_364_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:44.043 INFO SparkContext - Created broadcast 364 from broadcast at BamSink.java:76
19:48:44.045 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:44.045 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:44.045 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:44.066 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:44.066 INFO DAGScheduler - Registering RDD 879 (mapToPair at SparkUtils.java:161) as input to shuffle 37
19:48:44.067 INFO DAGScheduler - Got job 137 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:44.067 INFO DAGScheduler - Final stage: ResultStage 186 (runJob at SparkHadoopWriter.scala:83)
19:48:44.067 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 185)
19:48:44.067 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 185)
19:48:44.067 INFO DAGScheduler - Submitting ShuffleMapStage 185 (MapPartitionsRDD[879] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:44.084 INFO MemoryStore - Block broadcast_365 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
19:48:44.085 INFO MemoryStore - Block broadcast_365_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
19:48:44.086 INFO BlockManagerInfo - Added broadcast_365_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1918.9 MiB)
19:48:44.086 INFO SparkContext - Created broadcast 365 from broadcast at DAGScheduler.scala:1580
19:48:44.086 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 185 (MapPartitionsRDD[879] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:44.086 INFO TaskSchedulerImpl - Adding task set 185.0 with 1 tasks resource profile 0
19:48:44.086 INFO TaskSetManager - Starting task 0.0 in stage 185.0 (TID 241) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:44.087 INFO Executor - Running task 0.0 in stage 185.0 (TID 241)
19:48:44.116 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:44.132 INFO Executor - Finished task 0.0 in stage 185.0 (TID 241). 1148 bytes result sent to driver
19:48:44.133 INFO TaskSetManager - Finished task 0.0 in stage 185.0 (TID 241) in 47 ms on localhost (executor driver) (1/1)
19:48:44.133 INFO TaskSchedulerImpl - Removed TaskSet 185.0, whose tasks have all completed, from pool
19:48:44.133 INFO DAGScheduler - ShuffleMapStage 185 (mapToPair at SparkUtils.java:161) finished in 0.066 s
19:48:44.133 INFO DAGScheduler - looking for newly runnable stages
19:48:44.133 INFO DAGScheduler - running: HashSet()
19:48:44.133 INFO DAGScheduler - waiting: HashSet(ResultStage 186)
19:48:44.133 INFO DAGScheduler - failed: HashSet()
19:48:44.133 INFO DAGScheduler - Submitting ResultStage 186 (MapPartitionsRDD[884] at mapToPair at BamSink.java:91), which has no missing parents
19:48:44.144 INFO MemoryStore - Block broadcast_366 stored as values in memory (estimated size 241.4 KiB, free 1914.4 MiB)
19:48:44.149 INFO BlockManagerInfo - Removed broadcast_356_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.0 MiB)
19:48:44.149 INFO MemoryStore - Block broadcast_366_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1914.6 MiB)
19:48:44.149 INFO BlockManagerInfo - Added broadcast_366_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1918.9 MiB)
19:48:44.150 INFO BlockManagerInfo - Removed broadcast_358_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.1 MiB)
19:48:44.150 INFO SparkContext - Created broadcast 366 from broadcast at DAGScheduler.scala:1580
19:48:44.150 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 186 (MapPartitionsRDD[884] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:44.150 INFO TaskSchedulerImpl - Adding task set 186.0 with 1 tasks resource profile 0
19:48:44.150 INFO BlockManagerInfo - Removed broadcast_362_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.1 MiB)
19:48:44.151 INFO TaskSetManager - Starting task 0.0 in stage 186.0 (TID 242) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:44.151 INFO BlockManagerInfo - Removed broadcast_351_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:44.151 INFO Executor - Running task 0.0 in stage 186.0 (TID 242)
19:48:44.151 INFO BlockManagerInfo - Removed broadcast_359_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.3 MiB)
19:48:44.152 INFO BlockManagerInfo - Removed broadcast_357_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:44.153 INFO BlockManagerInfo - Removed broadcast_353_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:44.153 INFO BlockManagerInfo - Removed broadcast_360_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:44.153 INFO BlockManagerInfo - Removed broadcast_354_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.5 MiB)
19:48:44.154 INFO BlockManagerInfo - Removed broadcast_355_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.7 MiB)
19:48:44.157 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:44.157 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:44.168 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:44.168 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:44.168 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:44.169 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:44.169 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:44.169 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:44.187 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948443363078896532261222_0884_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace13895446637748759236/_temporary/0/task_202507151948443363078896532261222_0884_r_000000
19:48:44.187 INFO SparkHadoopMapRedUtil - attempt_202507151948443363078896532261222_0884_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:44.187 INFO Executor - Finished task 0.0 in stage 186.0 (TID 242). 1858 bytes result sent to driver
19:48:44.187 INFO TaskSetManager - Finished task 0.0 in stage 186.0 (TID 242) in 37 ms on localhost (executor driver) (1/1)
19:48:44.187 INFO TaskSchedulerImpl - Removed TaskSet 186.0, whose tasks have all completed, from pool
19:48:44.188 INFO DAGScheduler - ResultStage 186 (runJob at SparkHadoopWriter.scala:83) finished in 0.053 s
19:48:44.188 INFO DAGScheduler - Job 137 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:44.188 INFO TaskSchedulerImpl - Killing all running tasks in stage 186: Stage finished
19:48:44.188 INFO DAGScheduler - Job 137 finished: runJob at SparkHadoopWriter.scala:83, took 0.121715 s
19:48:44.188 INFO SparkHadoopWriter - Start to commit write Job job_202507151948443363078896532261222_0884.
19:48:44.192 INFO SparkHadoopWriter - Write Job job_202507151948443363078896532261222_0884 committed. Elapsed time: 4 ms.
19:48:44.203 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest13216551855049311854.bam
19:48:44.207 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest13216551855049311854.bam done
19:48:44.207 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace13895446637748759236 to /tmp/ReadsSparkSinkUnitTest13216551855049311854.bam.sbi
19:48:44.211 INFO IndexFileMerger - Done merging .sbi files
19:48:44.213 INFO MemoryStore - Block broadcast_367 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
19:48:44.214 INFO MemoryStore - Block broadcast_367_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
19:48:44.214 INFO BlockManagerInfo - Added broadcast_367_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.7 MiB)
19:48:44.214 INFO SparkContext - Created broadcast 367 from broadcast at BamSource.java:104
19:48:44.215 INFO MemoryStore - Block broadcast_368 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:44.221 INFO MemoryStore - Block broadcast_368_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:44.221 INFO BlockManagerInfo - Added broadcast_368_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:44.221 INFO SparkContext - Created broadcast 368 from newAPIHadoopFile at PathSplitSource.java:96
19:48:44.230 INFO FileInputFormat - Total input files to process : 1
19:48:44.244 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:44.244 INFO DAGScheduler - Got job 138 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:44.244 INFO DAGScheduler - Final stage: ResultStage 187 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:44.244 INFO DAGScheduler - Parents of final stage: List()
19:48:44.244 INFO DAGScheduler - Missing parents: List()
19:48:44.245 INFO DAGScheduler - Submitting ResultStage 187 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:44.250 INFO MemoryStore - Block broadcast_369 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
19:48:44.251 INFO MemoryStore - Block broadcast_369_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
19:48:44.251 INFO BlockManagerInfo - Added broadcast_369_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:44.251 INFO SparkContext - Created broadcast 369 from broadcast at DAGScheduler.scala:1580
19:48:44.251 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 187 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:44.252 INFO TaskSchedulerImpl - Adding task set 187.0 with 1 tasks resource profile 0
19:48:44.252 INFO TaskSetManager - Starting task 0.0 in stage 187.0 (TID 243) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:44.252 INFO Executor - Running task 0.0 in stage 187.0 (TID 243)
19:48:44.264 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13216551855049311854.bam:0+237038
19:48:44.268 INFO Executor - Finished task 0.0 in stage 187.0 (TID 243). 651526 bytes result sent to driver
19:48:44.271 INFO TaskSetManager - Finished task 0.0 in stage 187.0 (TID 243) in 18 ms on localhost (executor driver) (1/1)
19:48:44.271 INFO TaskSchedulerImpl - Removed TaskSet 187.0, whose tasks have all completed, from pool
19:48:44.271 INFO DAGScheduler - ResultStage 187 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.026 s
19:48:44.271 INFO DAGScheduler - Job 138 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:44.271 INFO TaskSchedulerImpl - Killing all running tasks in stage 187: Stage finished
19:48:44.271 INFO DAGScheduler - Job 138 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026756 s
19:48:44.286 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:44.286 INFO DAGScheduler - Got job 139 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:44.286 INFO DAGScheduler - Final stage: ResultStage 188 (count at ReadsSparkSinkUnitTest.java:185)
19:48:44.286 INFO DAGScheduler - Parents of final stage: List()
19:48:44.286 INFO DAGScheduler - Missing parents: List()
19:48:44.286 INFO DAGScheduler - Submitting ResultStage 188 (MapPartitionsRDD[872] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:44.313 INFO MemoryStore - Block broadcast_370 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:48:44.314 INFO MemoryStore - Block broadcast_370_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
19:48:44.314 INFO BlockManagerInfo - Added broadcast_370_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:44.314 INFO SparkContext - Created broadcast 370 from broadcast at DAGScheduler.scala:1580
19:48:44.315 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 188 (MapPartitionsRDD[872] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:44.315 INFO TaskSchedulerImpl - Adding task set 188.0 with 1 tasks resource profile 0
19:48:44.315 INFO TaskSetManager - Starting task 0.0 in stage 188.0 (TID 244) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:44.315 INFO Executor - Running task 0.0 in stage 188.0 (TID 244)
19:48:44.344 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:44.354 INFO Executor - Finished task 0.0 in stage 188.0 (TID 244). 989 bytes result sent to driver
19:48:44.354 INFO TaskSetManager - Finished task 0.0 in stage 188.0 (TID 244) in 39 ms on localhost (executor driver) (1/1)
19:48:44.354 INFO TaskSchedulerImpl - Removed TaskSet 188.0, whose tasks have all completed, from pool
19:48:44.354 INFO DAGScheduler - ResultStage 188 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.067 s
19:48:44.354 INFO DAGScheduler - Job 139 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:44.354 INFO TaskSchedulerImpl - Killing all running tasks in stage 188: Stage finished
19:48:44.354 INFO DAGScheduler - Job 139 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.068290 s
19:48:44.359 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:44.359 INFO DAGScheduler - Got job 140 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:44.359 INFO DAGScheduler - Final stage: ResultStage 189 (count at ReadsSparkSinkUnitTest.java:185)
19:48:44.359 INFO DAGScheduler - Parents of final stage: List()
19:48:44.359 INFO DAGScheduler - Missing parents: List()
19:48:44.359 INFO DAGScheduler - Submitting ResultStage 189 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:44.366 INFO MemoryStore - Block broadcast_371 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
19:48:44.367 INFO MemoryStore - Block broadcast_371_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
19:48:44.367 INFO BlockManagerInfo - Added broadcast_371_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.4 MiB)
19:48:44.367 INFO SparkContext - Created broadcast 371 from broadcast at DAGScheduler.scala:1580
19:48:44.367 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 189 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:44.367 INFO TaskSchedulerImpl - Adding task set 189.0 with 1 tasks resource profile 0
19:48:44.368 INFO TaskSetManager - Starting task 0.0 in stage 189.0 (TID 245) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:44.368 INFO Executor - Running task 0.0 in stage 189.0 (TID 245)
19:48:44.379 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13216551855049311854.bam:0+237038
19:48:44.382 INFO Executor - Finished task 0.0 in stage 189.0 (TID 245). 989 bytes result sent to driver
19:48:44.382 INFO TaskSetManager - Finished task 0.0 in stage 189.0 (TID 245) in 14 ms on localhost (executor driver) (1/1)
19:48:44.382 INFO TaskSchedulerImpl - Removed TaskSet 189.0, whose tasks have all completed, from pool
19:48:44.382 INFO DAGScheduler - ResultStage 189 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.023 s
19:48:44.383 INFO DAGScheduler - Job 140 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:44.383 INFO TaskSchedulerImpl - Killing all running tasks in stage 189: Stage finished
19:48:44.383 INFO DAGScheduler - Job 140 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.023779 s
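[annotation] Jobs 138-140 above are the test's verification pass: the freshly written BAM is read back, its records are collected, and its count is compared against the original input. A hedged sketch of that shape (the generic assertSameReads helper is illustrative, not the test's actual code):

    import java.util.List;
    import org.apache.spark.api.java.JavaRDD;

    public final class RoundTripCheckSketch {
        // Compare an original reads RDD with the RDD read back from the file that was just written.
        static <T> void assertSameReads(JavaRDD<T> original, JavaRDD<T> readBack) {
            List<T> expected = original.collect();      // a collect job, as at ReadsSparkSinkUnitTest.java:182
            List<T> actual = readBack.collect();
            if (!expected.equals(actual)) {
                throw new AssertionError("round-tripped reads differ");
            }
            if (original.count() != readBack.count()) { // count jobs, as at ReadsSparkSinkUnitTest.java:185
                throw new AssertionError("read counts differ");
            }
        }
    }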
19:48:44.390 INFO MemoryStore - Block broadcast_372 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
19:48:44.396 INFO MemoryStore - Block broadcast_372_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
19:48:44.396 INFO BlockManagerInfo - Added broadcast_372_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:44.396 INFO SparkContext - Created broadcast 372 from newAPIHadoopFile at PathSplitSource.java:96
19:48:44.418 INFO MemoryStore - Block broadcast_373 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
19:48:44.424 INFO MemoryStore - Block broadcast_373_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
19:48:44.424 INFO BlockManagerInfo - Added broadcast_373_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:44.424 INFO SparkContext - Created broadcast 373 from newAPIHadoopFile at PathSplitSource.java:96
19:48:44.443 INFO FileInputFormat - Total input files to process : 1
19:48:44.445 INFO MemoryStore - Block broadcast_374 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
19:48:44.446 INFO MemoryStore - Block broadcast_374_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
19:48:44.446 INFO BlockManagerInfo - Added broadcast_374_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:44.446 INFO SparkContext - Created broadcast 374 from broadcast at ReadsSparkSink.java:133
19:48:44.447 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:44.447 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:44.447 INFO MemoryStore - Block broadcast_375 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
19:48:44.448 INFO MemoryStore - Block broadcast_375_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
19:48:44.448 INFO BlockManagerInfo - Added broadcast_375_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:44.448 INFO SparkContext - Created broadcast 375 from broadcast at BamSink.java:76
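[annotation] Broadcasts 374 and 375 above (created at ReadsSparkSink.java:133 and BamSink.java:76, roughly 160 KiB of values, about 10 KiB serialized) are small objects shipped once to each executor instead of being serialized into every task closure; a file header is the usual candidate, though that is an inference from the log. A generic sketch of the pattern, where the header type H is a stand-in:

    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.broadcast.Broadcast;

    public final class HeaderBroadcastSketch {
        // Ship a small, serializable header-like object to executors exactly once.
        static <H> Broadcast<H> shipHeader(JavaSparkContext jsc, H header) {
            return jsc.broadcast(header); // corresponds to the "Created broadcast N from broadcast at ..." lines
        }
    }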
19:48:44.450 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:44.450 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:44.450 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:44.467 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:44.467 INFO DAGScheduler - Registering RDD 904 (mapToPair at SparkUtils.java:161) as input to shuffle 38
19:48:44.467 INFO DAGScheduler - Got job 141 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:44.468 INFO DAGScheduler - Final stage: ResultStage 191 (runJob at SparkHadoopWriter.scala:83)
19:48:44.468 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 190)
19:48:44.468 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 190)
19:48:44.468 INFO DAGScheduler - Submitting ShuffleMapStage 190 (MapPartitionsRDD[904] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:44.485 INFO MemoryStore - Block broadcast_376 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
19:48:44.486 INFO MemoryStore - Block broadcast_376_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
19:48:44.486 INFO BlockManagerInfo - Added broadcast_376_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.1 MiB)
19:48:44.486 INFO SparkContext - Created broadcast 376 from broadcast at DAGScheduler.scala:1580
19:48:44.487 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 190 (MapPartitionsRDD[904] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:44.487 INFO TaskSchedulerImpl - Adding task set 190.0 with 1 tasks resource profile 0
19:48:44.487 INFO TaskSetManager - Starting task 0.0 in stage 190.0 (TID 246) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:44.487 INFO Executor - Running task 0.0 in stage 190.0 (TID 246)
19:48:44.521 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:44.536 INFO Executor - Finished task 0.0 in stage 190.0 (TID 246). 1148 bytes result sent to driver
19:48:44.536 INFO TaskSetManager - Finished task 0.0 in stage 190.0 (TID 246) in 49 ms on localhost (executor driver) (1/1)
19:48:44.537 INFO TaskSchedulerImpl - Removed TaskSet 190.0, whose tasks have all completed, from pool
19:48:44.537 INFO DAGScheduler - ShuffleMapStage 190 (mapToPair at SparkUtils.java:161) finished in 0.069 s
19:48:44.537 INFO DAGScheduler - looking for newly runnable stages
19:48:44.537 INFO DAGScheduler - running: HashSet()
19:48:44.537 INFO DAGScheduler - waiting: HashSet(ResultStage 191)
19:48:44.537 INFO DAGScheduler - failed: HashSet()
19:48:44.537 INFO DAGScheduler - Submitting ResultStage 191 (MapPartitionsRDD[909] at mapToPair at BamSink.java:91), which has no missing parents
19:48:44.543 INFO MemoryStore - Block broadcast_377 stored as values in memory (estimated size 241.4 KiB, free 1915.1 MiB)
19:48:44.548 INFO BlockManagerInfo - Removed broadcast_364_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.1 MiB)
19:48:44.548 INFO MemoryStore - Block broadcast_377_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1915.2 MiB)
19:48:44.549 INFO BlockManagerInfo - Added broadcast_377_piece0 in memory on localhost:36125 (size: 66.9 KiB, free: 1919.1 MiB)
19:48:44.549 INFO BlockManagerInfo - Removed broadcast_366_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.1 MiB)
19:48:44.549 INFO SparkContext - Created broadcast 377 from broadcast at DAGScheduler.scala:1580
19:48:44.549 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 191 (MapPartitionsRDD[909] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:44.549 INFO BlockManagerInfo - Removed broadcast_363_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.1 MiB)
19:48:44.549 INFO TaskSchedulerImpl - Adding task set 191.0 with 1 tasks resource profile 0
19:48:44.549 INFO BlockManagerInfo - Removed broadcast_371_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.2 MiB)
19:48:44.550 INFO TaskSetManager - Starting task 0.0 in stage 191.0 (TID 247) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:44.550 INFO BlockManagerInfo - Removed broadcast_369_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.2 MiB)
19:48:44.550 INFO Executor - Running task 0.0 in stage 191.0 (TID 247)
19:48:44.550 INFO BlockManagerInfo - Removed broadcast_365_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.4 MiB)
19:48:44.551 INFO BlockManagerInfo - Removed broadcast_361_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:44.551 INFO BlockManagerInfo - Removed broadcast_370_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.6 MiB)
19:48:44.551 INFO BlockManagerInfo - Removed broadcast_367_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.6 MiB)
19:48:44.552 INFO BlockManagerInfo - Removed broadcast_368_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:44.552 INFO BlockManagerInfo - Removed broadcast_373_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:44.557 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:44.557 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:44.573 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:44.573 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:44.573 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:44.573 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:44.573 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:44.573 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:44.587 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948441874264060409786610_0909_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace575283171456150670/_temporary/0/task_202507151948441874264060409786610_0909_r_000000
19:48:44.587 INFO SparkHadoopMapRedUtil - attempt_202507151948441874264060409786610_0909_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:44.587 INFO Executor - Finished task 0.0 in stage 191.0 (TID 247). 1858 bytes result sent to driver
19:48:44.588 INFO TaskSetManager - Finished task 0.0 in stage 191.0 (TID 247) in 38 ms on localhost (executor driver) (1/1)
19:48:44.588 INFO TaskSchedulerImpl - Removed TaskSet 191.0, whose tasks have all completed, from pool
19:48:44.588 INFO DAGScheduler - ResultStage 191 (runJob at SparkHadoopWriter.scala:83) finished in 0.051 s
19:48:44.588 INFO DAGScheduler - Job 141 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:44.588 INFO TaskSchedulerImpl - Killing all running tasks in stage 191: Stage finished
19:48:44.588 INFO DAGScheduler - Job 141 finished: runJob at SparkHadoopWriter.scala:83, took 0.120993 s
19:48:44.588 INFO SparkHadoopWriter - Start to commit write Job job_202507151948441874264060409786610_0909.
19:48:44.592 INFO SparkHadoopWriter - Write Job job_202507151948441874264060409786610_0909 committed. Elapsed time: 4 ms.
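[annotation] Job 141 above follows the same two-stage shape as job 137: a ShuffleMapStage built from "mapToPair at SparkUtils.java:161", then a ResultStage that writes via "mapToPair at BamSink.java:91". The shuffle appears to be the sort that puts records in output order before they are written. A minimal Spark sketch of that key-then-sort pattern; keying by an integer position is an assumption for illustration only:

    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.function.Function;
    import scala.Tuple2;

    public final class SortShuffleSketch {
        // Key each read, sort across a shuffle, and drop the keys again before writing.
        static <R> JavaRDD<R> sortReads(JavaRDD<R> reads, Function<R, Integer> keyOf) {
            JavaPairRDD<Integer, R> keyed = reads.mapToPair(r -> new Tuple2<>(keyOf.call(r), r));
            return keyed.sortByKey().values(); // one ShuffleMapStage + one ResultStage, as in the DAG above
        }
    }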
19:48:44.603 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest115800691309756215593.bam
19:48:44.607 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest115800691309756215593.bam done
19:48:44.609 INFO MemoryStore - Block broadcast_378 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:44.616 INFO MemoryStore - Block broadcast_378_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:44.616 INFO BlockManagerInfo - Added broadcast_378_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:44.616 INFO SparkContext - Created broadcast 378 from newAPIHadoopFile at PathSplitSource.java:96
19:48:44.636 INFO FileInputFormat - Total input files to process : 1
19:48:44.671 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:44.671 INFO DAGScheduler - Got job 142 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:44.671 INFO DAGScheduler - Final stage: ResultStage 192 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:44.671 INFO DAGScheduler - Parents of final stage: List()
19:48:44.671 INFO DAGScheduler - Missing parents: List()
19:48:44.671 INFO DAGScheduler - Submitting ResultStage 192 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:44.688 INFO MemoryStore - Block broadcast_379 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
19:48:44.689 INFO MemoryStore - Block broadcast_379_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1917.4 MiB)
19:48:44.689 INFO BlockManagerInfo - Added broadcast_379_piece0 in memory on localhost:36125 (size: 153.7 KiB, free: 1919.5 MiB)
19:48:44.689 INFO SparkContext - Created broadcast 379 from broadcast at DAGScheduler.scala:1580
19:48:44.689 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 192 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:44.689 INFO TaskSchedulerImpl - Adding task set 192.0 with 1 tasks resource profile 0
19:48:44.690 INFO TaskSetManager - Starting task 0.0 in stage 192.0 (TID 248) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:44.690 INFO Executor - Running task 0.0 in stage 192.0 (TID 248)
19:48:44.719 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115800691309756215593.bam:0+237038
19:48:44.731 INFO Executor - Finished task 0.0 in stage 192.0 (TID 248). 651483 bytes result sent to driver
19:48:44.732 INFO TaskSetManager - Finished task 0.0 in stage 192.0 (TID 248) in 42 ms on localhost (executor driver) (1/1)
19:48:44.732 INFO TaskSchedulerImpl - Removed TaskSet 192.0, whose tasks have all completed, from pool
19:48:44.732 INFO DAGScheduler - ResultStage 192 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.061 s
19:48:44.732 INFO DAGScheduler - Job 142 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:44.732 INFO TaskSchedulerImpl - Killing all running tasks in stage 192: Stage finished
19:48:44.732 INFO DAGScheduler - Job 142 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.061886 s
19:48:44.742 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:44.742 INFO DAGScheduler - Got job 143 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:44.742 INFO DAGScheduler - Final stage: ResultStage 193 (count at ReadsSparkSinkUnitTest.java:185)
19:48:44.742 INFO DAGScheduler - Parents of final stage: List()
19:48:44.742 INFO DAGScheduler - Missing parents: List()
19:48:44.742 INFO DAGScheduler - Submitting ResultStage 193 (MapPartitionsRDD[897] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:44.759 INFO MemoryStore - Block broadcast_380 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
19:48:44.760 INFO MemoryStore - Block broadcast_380_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
19:48:44.760 INFO BlockManagerInfo - Added broadcast_380_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.4 MiB)
19:48:44.760 INFO SparkContext - Created broadcast 380 from broadcast at DAGScheduler.scala:1580
19:48:44.760 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 193 (MapPartitionsRDD[897] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:44.760 INFO TaskSchedulerImpl - Adding task set 193.0 with 1 tasks resource profile 0
19:48:44.761 INFO TaskSetManager - Starting task 0.0 in stage 193.0 (TID 249) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:44.761 INFO Executor - Running task 0.0 in stage 193.0 (TID 249)
19:48:44.790 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:44.799 INFO Executor - Finished task 0.0 in stage 193.0 (TID 249). 989 bytes result sent to driver
19:48:44.799 INFO TaskSetManager - Finished task 0.0 in stage 193.0 (TID 249) in 38 ms on localhost (executor driver) (1/1)
19:48:44.799 INFO TaskSchedulerImpl - Removed TaskSet 193.0, whose tasks have all completed, from pool
19:48:44.800 INFO DAGScheduler - ResultStage 193 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
19:48:44.800 INFO DAGScheduler - Job 143 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:44.800 INFO TaskSchedulerImpl - Killing all running tasks in stage 193: Stage finished
19:48:44.800 INFO DAGScheduler - Job 143 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057986 s
19:48:44.803 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:44.803 INFO DAGScheduler - Got job 144 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:44.803 INFO DAGScheduler - Final stage: ResultStage 194 (count at ReadsSparkSinkUnitTest.java:185)
19:48:44.803 INFO DAGScheduler - Parents of final stage: List()
19:48:44.803 INFO DAGScheduler - Missing parents: List()
19:48:44.803 INFO DAGScheduler - Submitting ResultStage 194 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:44.826 INFO MemoryStore - Block broadcast_381 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
19:48:44.827 INFO MemoryStore - Block broadcast_381_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
19:48:44.827 INFO BlockManagerInfo - Added broadcast_381_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.2 MiB)
19:48:44.828 INFO SparkContext - Created broadcast 381 from broadcast at DAGScheduler.scala:1580
19:48:44.828 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 194 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:44.828 INFO TaskSchedulerImpl - Adding task set 194.0 with 1 tasks resource profile 0
19:48:44.828 INFO TaskSetManager - Starting task 0.0 in stage 194.0 (TID 250) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:44.828 INFO Executor - Running task 0.0 in stage 194.0 (TID 250)
19:48:44.857 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115800691309756215593.bam:0+237038
19:48:44.869 INFO Executor - Finished task 0.0 in stage 194.0 (TID 250). 989 bytes result sent to driver
19:48:44.869 INFO TaskSetManager - Finished task 0.0 in stage 194.0 (TID 250) in 41 ms on localhost (executor driver) (1/1)
19:48:44.869 INFO TaskSchedulerImpl - Removed TaskSet 194.0, whose tasks have all completed, from pool
19:48:44.869 INFO DAGScheduler - ResultStage 194 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
19:48:44.869 INFO DAGScheduler - Job 144 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:44.869 INFO TaskSchedulerImpl - Killing all running tasks in stage 194: Stage finished
19:48:44.870 INFO DAGScheduler - Job 144 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066566 s
19:48:44.877 INFO MemoryStore - Block broadcast_382 stored as values in memory (estimated size 298.0 KiB, free 1916.0 MiB)
19:48:44.884 INFO MemoryStore - Block broadcast_382_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.0 MiB)
19:48:44.884 INFO BlockManagerInfo - Added broadcast_382_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.2 MiB)
19:48:44.884 INFO SparkContext - Created broadcast 382 from newAPIHadoopFile at PathSplitSource.java:96
19:48:44.905 INFO MemoryStore - Block broadcast_383 stored as values in memory (estimated size 298.0 KiB, free 1915.7 MiB)
19:48:44.911 INFO MemoryStore - Block broadcast_383_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1915.6 MiB)
19:48:44.912 INFO BlockManagerInfo - Added broadcast_383_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.1 MiB)
19:48:44.912 INFO SparkContext - Created broadcast 383 from newAPIHadoopFile at PathSplitSource.java:96
19:48:44.932 INFO FileInputFormat - Total input files to process : 1
19:48:44.934 INFO MemoryStore - Block broadcast_384 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
19:48:44.935 INFO MemoryStore - Block broadcast_384_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
19:48:44.935 INFO BlockManagerInfo - Added broadcast_384_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:44.935 INFO SparkContext - Created broadcast 384 from broadcast at ReadsSparkSink.java:133
19:48:44.936 INFO MemoryStore - Block broadcast_385 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
19:48:44.937 INFO MemoryStore - Block broadcast_385_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
19:48:44.937 INFO BlockManagerInfo - Added broadcast_385_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.1 MiB)
19:48:44.937 INFO SparkContext - Created broadcast 385 from broadcast at BamSink.java:76
19:48:44.939 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:44.939 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:44.939 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:44.955 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:44.956 INFO DAGScheduler - Registering RDD 930 (mapToPair at SparkUtils.java:161) as input to shuffle 39
19:48:44.956 INFO DAGScheduler - Got job 145 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:44.956 INFO DAGScheduler - Final stage: ResultStage 196 (runJob at SparkHadoopWriter.scala:83)
19:48:44.956 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 195)
19:48:44.956 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 195)
19:48:44.956 INFO DAGScheduler - Submitting ShuffleMapStage 195 (MapPartitionsRDD[930] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:44.973 INFO MemoryStore - Block broadcast_386 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
19:48:44.974 INFO MemoryStore - Block broadcast_386_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
19:48:44.974 INFO BlockManagerInfo - Added broadcast_386_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1918.9 MiB)
19:48:44.975 INFO SparkContext - Created broadcast 386 from broadcast at DAGScheduler.scala:1580
19:48:44.975 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 195 (MapPartitionsRDD[930] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:44.975 INFO TaskSchedulerImpl - Adding task set 195.0 with 1 tasks resource profile 0
19:48:44.975 INFO TaskSetManager - Starting task 0.0 in stage 195.0 (TID 251) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
19:48:44.976 INFO Executor - Running task 0.0 in stage 195.0 (TID 251)
19:48:45.008 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:45.025 INFO Executor - Finished task 0.0 in stage 195.0 (TID 251). 1148 bytes result sent to driver
19:48:45.025 INFO TaskSetManager - Finished task 0.0 in stage 195.0 (TID 251) in 50 ms on localhost (executor driver) (1/1)
19:48:45.025 INFO TaskSchedulerImpl - Removed TaskSet 195.0, whose tasks have all completed, from pool
19:48:45.026 INFO DAGScheduler - ShuffleMapStage 195 (mapToPair at SparkUtils.java:161) finished in 0.070 s
19:48:45.026 INFO DAGScheduler - looking for newly runnable stages
19:48:45.026 INFO DAGScheduler - running: HashSet()
19:48:45.026 INFO DAGScheduler - waiting: HashSet(ResultStage 196)
19:48:45.026 INFO DAGScheduler - failed: HashSet()
19:48:45.026 INFO DAGScheduler - Submitting ResultStage 196 (MapPartitionsRDD[935] at mapToPair at BamSink.java:91), which has no missing parents
19:48:45.032 INFO MemoryStore - Block broadcast_387 stored as values in memory (estimated size 241.4 KiB, free 1914.4 MiB)
19:48:45.033 INFO MemoryStore - Block broadcast_387_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1914.3 MiB)
19:48:45.033 INFO BlockManagerInfo - Added broadcast_387_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1918.9 MiB)
19:48:45.034 INFO SparkContext - Created broadcast 387 from broadcast at DAGScheduler.scala:1580
19:48:45.034 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 196 (MapPartitionsRDD[935] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:45.034 INFO TaskSchedulerImpl - Adding task set 196.0 with 1 tasks resource profile 0
19:48:45.034 INFO TaskSetManager - Starting task 0.0 in stage 196.0 (TID 252) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:45.034 INFO Executor - Running task 0.0 in stage 196.0 (TID 252)
19:48:45.038 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:45.039 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:45.049 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:45.049 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:45.049 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:45.049 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:45.049 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:45.049 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:45.072 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948441578227397922533265_0935_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest2.someOtherPlace6290307548798661654/_temporary/0/task_202507151948441578227397922533265_0935_r_000000
19:48:45.072 INFO SparkHadoopMapRedUtil - attempt_202507151948441578227397922533265_0935_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:45.073 INFO Executor - Finished task 0.0 in stage 196.0 (TID 252). 1858 bytes result sent to driver
19:48:45.073 INFO TaskSetManager - Finished task 0.0 in stage 196.0 (TID 252) in 39 ms on localhost (executor driver) (1/1)
19:48:45.073 INFO TaskSchedulerImpl - Removed TaskSet 196.0, whose tasks have all completed, from pool
19:48:45.073 INFO DAGScheduler - ResultStage 196 (runJob at SparkHadoopWriter.scala:83) finished in 0.047 s
19:48:45.073 INFO DAGScheduler - Job 145 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.073 INFO TaskSchedulerImpl - Killing all running tasks in stage 196: Stage finished
19:48:45.074 INFO DAGScheduler - Job 145 finished: runJob at SparkHadoopWriter.scala:83, took 0.118273 s
19:48:45.074 INFO SparkHadoopWriter - Start to commit write Job job_202507151948441578227397922533265_0935.
19:48:45.078 INFO SparkHadoopWriter - Write Job job_202507151948441578227397922533265_0935 committed. Elapsed time: 4 ms.
19:48:45.089 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest24836477789722702023.bam
19:48:45.093 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest24836477789722702023.bam done
19:48:45.094 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest2.someOtherPlace6290307548798661654 to /tmp/ReadsSparkSinkUnitTest24836477789722702023.bam.sbi
19:48:45.098 INFO IndexFileMerger - Done merging .sbi files
19:48:45.098 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest2.someOtherPlace6290307548798661654 to /tmp/ReadsSparkSinkUnitTest24836477789722702023.bam.bai
19:48:45.103 INFO IndexFileMerger - Done merging .bai files
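[annotation] At this point the write above is complete: the part files have been concatenated into the .bam and the .sbi and .bai parts merged beside it. A small, hypothetical check that all three artifacts exist next to each other:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public final class OutputArtifactsCheck {
        // True when the merged BAM and both of its merged indexes are present.
        static boolean allOutputsPresent(Configuration conf, Path bam) throws IOException {
            FileSystem fs = bam.getFileSystem(conf);
            return fs.exists(bam) && fs.exists(bam.suffix(".sbi")) && fs.exists(bam.suffix(".bai"));
        }
    }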
19:48:45.104 INFO MemoryStore - Block broadcast_388 stored as values in memory (estimated size 320.0 B, free 1914.3 MiB)
19:48:45.109 INFO MemoryStore - Block broadcast_388_piece0 stored as bytes in memory (estimated size 233.0 B, free 1914.3 MiB)
19:48:45.109 INFO BlockManagerInfo - Added broadcast_388_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1918.9 MiB)
19:48:45.109 INFO BlockManagerInfo - Removed broadcast_387_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1918.9 MiB)
19:48:45.109 INFO SparkContext - Created broadcast 388 from broadcast at BamSource.java:104
19:48:45.110 INFO BlockManagerInfo - Removed broadcast_380_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.1 MiB)
19:48:45.110 INFO BlockManagerInfo - Removed broadcast_386_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.2 MiB)
19:48:45.110 INFO MemoryStore - Block broadcast_389 stored as values in memory (estimated size 297.9 KiB, free 1915.6 MiB)
19:48:45.110 INFO BlockManagerInfo - Removed broadcast_376_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.4 MiB)
19:48:45.111 INFO BlockManagerInfo - Removed broadcast_378_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:45.112 INFO BlockManagerInfo - Removed broadcast_379_piece0 on localhost:36125 in memory (size: 153.7 KiB, free: 1919.6 MiB)
19:48:45.112 INFO BlockManagerInfo - Removed broadcast_383_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.6 MiB)
19:48:45.112 INFO BlockManagerInfo - Removed broadcast_375_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.7 MiB)
19:48:45.113 INFO BlockManagerInfo - Removed broadcast_384_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.7 MiB)
19:48:45.113 INFO BlockManagerInfo - Removed broadcast_374_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.7 MiB)
19:48:45.114 INFO BlockManagerInfo - Removed broadcast_372_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:45.114 INFO BlockManagerInfo - Removed broadcast_381_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.9 MiB)
19:48:45.115 INFO BlockManagerInfo - Removed broadcast_385_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:45.115 INFO BlockManagerInfo - Removed broadcast_377_piece0 on localhost:36125 in memory (size: 66.9 KiB, free: 1920.0 MiB)
19:48:45.119 INFO MemoryStore - Block broadcast_389_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:45.119 INFO BlockManagerInfo - Added broadcast_389_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:45.119 INFO SparkContext - Created broadcast 389 from newAPIHadoopFile at PathSplitSource.java:96
19:48:45.127 INFO FileInputFormat - Total input files to process : 1
19:48:45.142 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:45.142 INFO DAGScheduler - Got job 146 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:45.142 INFO DAGScheduler - Final stage: ResultStage 197 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:45.142 INFO DAGScheduler - Parents of final stage: List()
19:48:45.142 INFO DAGScheduler - Missing parents: List()
19:48:45.142 INFO DAGScheduler - Submitting ResultStage 197 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:45.148 INFO MemoryStore - Block broadcast_390 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
19:48:45.149 INFO MemoryStore - Block broadcast_390_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
19:48:45.149 INFO BlockManagerInfo - Added broadcast_390_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:45.149 INFO SparkContext - Created broadcast 390 from broadcast at DAGScheduler.scala:1580
19:48:45.149 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 197 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:45.149 INFO TaskSchedulerImpl - Adding task set 197.0 with 1 tasks resource profile 0
19:48:45.150 INFO TaskSetManager - Starting task 0.0 in stage 197.0 (TID 253) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:45.150 INFO Executor - Running task 0.0 in stage 197.0 (TID 253)
19:48:45.161 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest24836477789722702023.bam:0+235514
19:48:45.166 INFO Executor - Finished task 0.0 in stage 197.0 (TID 253). 650184 bytes result sent to driver
19:48:45.167 INFO TaskSetManager - Finished task 0.0 in stage 197.0 (TID 253) in 17 ms on localhost (executor driver) (1/1)
19:48:45.167 INFO TaskSchedulerImpl - Removed TaskSet 197.0, whose tasks have all completed, from pool
19:48:45.168 INFO DAGScheduler - ResultStage 197 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
19:48:45.168 INFO DAGScheduler - Job 146 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.168 INFO TaskSchedulerImpl - Killing all running tasks in stage 197: Stage finished
19:48:45.168 INFO DAGScheduler - Job 146 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025886 s
19:48:45.178 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:45.178 INFO DAGScheduler - Got job 147 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:45.178 INFO DAGScheduler - Final stage: ResultStage 198 (count at ReadsSparkSinkUnitTest.java:185)
19:48:45.178 INFO DAGScheduler - Parents of final stage: List()
19:48:45.178 INFO DAGScheduler - Missing parents: List()
19:48:45.178 INFO DAGScheduler - Submitting ResultStage 198 (MapPartitionsRDD[923] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:45.195 INFO MemoryStore - Block broadcast_391 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
19:48:45.196 INFO MemoryStore - Block broadcast_391_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
19:48:45.196 INFO BlockManagerInfo - Added broadcast_391_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:45.196 INFO SparkContext - Created broadcast 391 from broadcast at DAGScheduler.scala:1580
19:48:45.196 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 198 (MapPartitionsRDD[923] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:45.196 INFO TaskSchedulerImpl - Adding task set 198.0 with 1 tasks resource profile 0
19:48:45.197 INFO TaskSetManager - Starting task 0.0 in stage 198.0 (TID 254) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
19:48:45.197 INFO Executor - Running task 0.0 in stage 198.0 (TID 254)
19:48:45.226 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:45.237 INFO Executor - Finished task 0.0 in stage 198.0 (TID 254). 989 bytes result sent to driver
19:48:45.238 INFO TaskSetManager - Finished task 0.0 in stage 198.0 (TID 254) in 41 ms on localhost (executor driver) (1/1)
19:48:45.238 INFO TaskSchedulerImpl - Removed TaskSet 198.0, whose tasks have all completed, from pool
19:48:45.238 INFO DAGScheduler - ResultStage 198 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
19:48:45.238 INFO DAGScheduler - Job 147 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.238 INFO TaskSchedulerImpl - Killing all running tasks in stage 198: Stage finished
19:48:45.238 INFO DAGScheduler - Job 147 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060280 s
19:48:45.242 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:45.242 INFO DAGScheduler - Got job 148 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:45.242 INFO DAGScheduler - Final stage: ResultStage 199 (count at ReadsSparkSinkUnitTest.java:185)
19:48:45.242 INFO DAGScheduler - Parents of final stage: List()
19:48:45.242 INFO DAGScheduler - Missing parents: List()
19:48:45.242 INFO DAGScheduler - Submitting ResultStage 199 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:45.248 INFO MemoryStore - Block broadcast_392 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
19:48:45.249 INFO MemoryStore - Block broadcast_392_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.4 MiB)
19:48:45.249 INFO BlockManagerInfo - Added broadcast_392_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.6 MiB)
19:48:45.249 INFO SparkContext - Created broadcast 392 from broadcast at DAGScheduler.scala:1580
19:48:45.249 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 199 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:45.249 INFO TaskSchedulerImpl - Adding task set 199.0 with 1 tasks resource profile 0
19:48:45.250 INFO TaskSetManager - Starting task 0.0 in stage 199.0 (TID 255) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:45.250 INFO Executor - Running task 0.0 in stage 199.0 (TID 255)
19:48:45.261 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest24836477789722702023.bam:0+235514
19:48:45.265 INFO Executor - Finished task 0.0 in stage 199.0 (TID 255). 989 bytes result sent to driver
19:48:45.265 INFO TaskSetManager - Finished task 0.0 in stage 199.0 (TID 255) in 16 ms on localhost (executor driver) (1/1)
19:48:45.265 INFO TaskSchedulerImpl - Removed TaskSet 199.0, whose tasks have all completed, from pool
19:48:45.265 INFO DAGScheduler - ResultStage 199 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.023 s
19:48:45.266 INFO DAGScheduler - Job 148 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.266 INFO TaskSchedulerImpl - Killing all running tasks in stage 199: Stage finished
19:48:45.266 INFO DAGScheduler - Job 148 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.023848 s
19:48:45.274 INFO MemoryStore - Block broadcast_393 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
19:48:45.282 INFO MemoryStore - Block broadcast_393_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:45.282 INFO BlockManagerInfo - Added broadcast_393_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:45.283 INFO SparkContext - Created broadcast 393 from newAPIHadoopFile at PathSplitSource.java:96
19:48:45.303 INFO MemoryStore - Block broadcast_394 stored as values in memory (estimated size 298.0 KiB, free 1917.7 MiB)
19:48:45.309 INFO MemoryStore - Block broadcast_394_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:45.309 INFO BlockManagerInfo - Added broadcast_394_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:45.309 INFO SparkContext - Created broadcast 394 from newAPIHadoopFile at PathSplitSource.java:96
19:48:45.328 INFO FileInputFormat - Total input files to process : 1
19:48:45.330 INFO MemoryStore - Block broadcast_395 stored as values in memory (estimated size 19.6 KiB, free 1917.7 MiB)
19:48:45.330 INFO MemoryStore - Block broadcast_395_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.7 MiB)
19:48:45.330 INFO BlockManagerInfo - Added broadcast_395_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.5 MiB)
19:48:45.330 INFO SparkContext - Created broadcast 395 from broadcast at ReadsSparkSink.java:133
19:48:45.331 INFO MemoryStore - Block broadcast_396 stored as values in memory (estimated size 20.0 KiB, free 1917.6 MiB)
19:48:45.331 INFO MemoryStore - Block broadcast_396_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.6 MiB)
19:48:45.331 INFO BlockManagerInfo - Added broadcast_396_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.5 MiB)
19:48:45.331 INFO SparkContext - Created broadcast 396 from broadcast at BamSink.java:76
19:48:45.334 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:45.334 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:45.334 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:45.350 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:45.350 INFO DAGScheduler - Registering RDD 955 (mapToPair at SparkUtils.java:161) as input to shuffle 40
19:48:45.350 INFO DAGScheduler - Got job 149 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:45.350 INFO DAGScheduler - Final stage: ResultStage 201 (runJob at SparkHadoopWriter.scala:83)
19:48:45.350 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 200)
19:48:45.350 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 200)
19:48:45.351 INFO DAGScheduler - Submitting ShuffleMapStage 200 (MapPartitionsRDD[955] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:45.367 INFO MemoryStore - Block broadcast_397 stored as values in memory (estimated size 434.3 KiB, free 1917.2 MiB)
19:48:45.369 INFO MemoryStore - Block broadcast_397_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1917.1 MiB)
19:48:45.369 INFO BlockManagerInfo - Added broadcast_397_piece0 in memory on localhost:36125 (size: 157.6 KiB, free: 1919.4 MiB)
19:48:45.369 INFO SparkContext - Created broadcast 397 from broadcast at DAGScheduler.scala:1580
19:48:45.369 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 200 (MapPartitionsRDD[955] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:45.369 INFO TaskSchedulerImpl - Adding task set 200.0 with 1 tasks resource profile 0
19:48:45.370 INFO TaskSetManager - Starting task 0.0 in stage 200.0 (TID 256) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
19:48:45.370 INFO Executor - Running task 0.0 in stage 200.0 (TID 256)
19:48:45.401 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:45.413 INFO Executor - Finished task 0.0 in stage 200.0 (TID 256). 1148 bytes result sent to driver
19:48:45.414 INFO TaskSetManager - Finished task 0.0 in stage 200.0 (TID 256) in 44 ms on localhost (executor driver) (1/1)
19:48:45.414 INFO TaskSchedulerImpl - Removed TaskSet 200.0, whose tasks have all completed, from pool
19:48:45.414 INFO DAGScheduler - ShuffleMapStage 200 (mapToPair at SparkUtils.java:161) finished in 0.063 s
19:48:45.414 INFO DAGScheduler - looking for newly runnable stages
19:48:45.414 INFO DAGScheduler - running: HashSet()
19:48:45.414 INFO DAGScheduler - waiting: HashSet(ResultStage 201)
19:48:45.414 INFO DAGScheduler - failed: HashSet()
19:48:45.414 INFO DAGScheduler - Submitting ResultStage 201 (MapPartitionsRDD[960] at mapToPair at BamSink.java:91), which has no missing parents
19:48:45.421 INFO MemoryStore - Block broadcast_398 stored as values in memory (estimated size 155.3 KiB, free 1916.9 MiB)
19:48:45.421 INFO MemoryStore - Block broadcast_398_piece0 stored as bytes in memory (estimated size 58.5 KiB, free 1916.8 MiB)
19:48:45.422 INFO BlockManagerInfo - Added broadcast_398_piece0 in memory on localhost:36125 (size: 58.5 KiB, free: 1919.3 MiB)
19:48:45.422 INFO SparkContext - Created broadcast 398 from broadcast at DAGScheduler.scala:1580
19:48:45.422 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 201 (MapPartitionsRDD[960] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:45.422 INFO TaskSchedulerImpl - Adding task set 201.0 with 1 tasks resource profile 0
19:48:45.422 INFO TaskSetManager - Starting task 0.0 in stage 201.0 (TID 257) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:45.423 INFO Executor - Running task 0.0 in stage 201.0 (TID 257)
19:48:45.427 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:45.427 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:45.437 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:45.437 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:45.437 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:45.437 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:45.437 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:45.437 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:45.459 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194845459550178519219684_0960_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest3.someOtherPlace1648866823885527429/_temporary/0/task_20250715194845459550178519219684_0960_r_000000
19:48:45.459 INFO SparkHadoopMapRedUtil - attempt_20250715194845459550178519219684_0960_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:45.460 INFO Executor - Finished task 0.0 in stage 201.0 (TID 257). 1858 bytes result sent to driver
19:48:45.460 INFO TaskSetManager - Finished task 0.0 in stage 201.0 (TID 257) in 38 ms on localhost (executor driver) (1/1)
19:48:45.460 INFO TaskSchedulerImpl - Removed TaskSet 201.0, whose tasks have all completed, from pool
19:48:45.460 INFO DAGScheduler - ResultStage 201 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
19:48:45.460 INFO DAGScheduler - Job 149 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.460 INFO TaskSchedulerImpl - Killing all running tasks in stage 201: Stage finished
19:48:45.460 INFO DAGScheduler - Job 149 finished: runJob at SparkHadoopWriter.scala:83, took 0.110409 s
19:48:45.461 INFO SparkHadoopWriter - Start to commit write Job job_20250715194845459550178519219684_0960.
19:48:45.465 INFO SparkHadoopWriter - Write Job job_20250715194845459550178519219684_0960 committed. Elapsed time: 4 ms.
19:48:45.475 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest312736268602049795677.bam
19:48:45.479 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest312736268602049795677.bam done
19:48:45.479 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest3.someOtherPlace1648866823885527429 to /tmp/ReadsSparkSinkUnitTest312736268602049795677.bam.sbi
19:48:45.484 INFO IndexFileMerger - Done merging .sbi files
19:48:45.484 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest3.someOtherPlace1648866823885527429 to /tmp/ReadsSparkSinkUnitTest312736268602049795677.bam.bai
19:48:45.489 INFO IndexFileMerger - Done merging .bai files
19:48:45.490 INFO MemoryStore - Block broadcast_399 stored as values in memory (estimated size 312.0 B, free 1916.8 MiB)
19:48:45.494 INFO MemoryStore - Block broadcast_399_piece0 stored as bytes in memory (estimated size 231.0 B, free 1916.8 MiB)
19:48:45.494 INFO BlockManagerInfo - Added broadcast_399_piece0 in memory on localhost:36125 (size: 231.0 B, free: 1919.3 MiB)
19:48:45.495 INFO SparkContext - Created broadcast 399 from broadcast at BamSource.java:104
19:48:45.495 INFO BlockManagerInfo - Removed broadcast_392_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.4 MiB)
19:48:45.495 INFO BlockManagerInfo - Removed broadcast_389_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:45.496 INFO BlockManagerInfo - Removed broadcast_391_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.6 MiB)
19:48:45.496 INFO MemoryStore - Block broadcast_400 stored as values in memory (estimated size 297.9 KiB, free 1917.2 MiB)
19:48:45.496 INFO BlockManagerInfo - Removed broadcast_395_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.6 MiB)
19:48:45.496 INFO BlockManagerInfo - Removed broadcast_396_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.6 MiB)
19:48:45.497 INFO BlockManagerInfo - Removed broadcast_394_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:45.497 INFO BlockManagerInfo - Removed broadcast_390_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.7 MiB)
19:48:45.498 INFO BlockManagerInfo - Removed broadcast_382_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.7 MiB)
19:48:45.498 INFO BlockManagerInfo - Removed broadcast_388_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.7 MiB)
19:48:45.499 INFO BlockManagerInfo - Removed broadcast_397_piece0 on localhost:36125 in memory (size: 157.6 KiB, free: 1919.9 MiB)
19:48:45.499 INFO BlockManagerInfo - Removed broadcast_398_piece0 on localhost:36125 in memory (size: 58.5 KiB, free: 1920.0 MiB)
19:48:45.503 INFO MemoryStore - Block broadcast_400_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:45.503 INFO BlockManagerInfo - Added broadcast_400_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:45.503 INFO SparkContext - Created broadcast 400 from newAPIHadoopFile at PathSplitSource.java:96
19:48:45.512 INFO FileInputFormat - Total input files to process : 1
19:48:45.526 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:45.526 INFO DAGScheduler - Got job 150 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:45.526 INFO DAGScheduler - Final stage: ResultStage 202 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:45.526 INFO DAGScheduler - Parents of final stage: List()
19:48:45.527 INFO DAGScheduler - Missing parents: List()
19:48:45.527 INFO DAGScheduler - Submitting ResultStage 202 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:45.532 INFO MemoryStore - Block broadcast_401 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
19:48:45.533 INFO MemoryStore - Block broadcast_401_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
19:48:45.533 INFO BlockManagerInfo - Added broadcast_401_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:45.533 INFO SparkContext - Created broadcast 401 from broadcast at DAGScheduler.scala:1580
19:48:45.534 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 202 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:45.534 INFO TaskSchedulerImpl - Adding task set 202.0 with 1 tasks resource profile 0
19:48:45.534 INFO TaskSetManager - Starting task 0.0 in stage 202.0 (TID 258) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:45.534 INFO Executor - Running task 0.0 in stage 202.0 (TID 258)
19:48:45.546 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest312736268602049795677.bam:0+236517
19:48:45.550 INFO Executor - Finished task 0.0 in stage 202.0 (TID 258). 749470 bytes result sent to driver
19:48:45.551 INFO TaskSetManager - Finished task 0.0 in stage 202.0 (TID 258) in 17 ms on localhost (executor driver) (1/1)
19:48:45.551 INFO TaskSchedulerImpl - Removed TaskSet 202.0, whose tasks have all completed, from pool
19:48:45.551 INFO DAGScheduler - ResultStage 202 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.024 s
19:48:45.551 INFO DAGScheduler - Job 150 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.551 INFO TaskSchedulerImpl - Killing all running tasks in stage 202: Stage finished
19:48:45.551 INFO DAGScheduler - Job 150 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025141 s
19:48:45.562 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:45.562 INFO DAGScheduler - Got job 151 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:45.562 INFO DAGScheduler - Final stage: ResultStage 203 (count at ReadsSparkSinkUnitTest.java:185)
19:48:45.562 INFO DAGScheduler - Parents of final stage: List()
19:48:45.562 INFO DAGScheduler - Missing parents: List()
19:48:45.562 INFO DAGScheduler - Submitting ResultStage 203 (MapPartitionsRDD[948] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:45.579 INFO MemoryStore - Block broadcast_402 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
19:48:45.580 INFO MemoryStore - Block broadcast_402_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
19:48:45.580 INFO BlockManagerInfo - Added broadcast_402_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:45.580 INFO SparkContext - Created broadcast 402 from broadcast at DAGScheduler.scala:1580
19:48:45.580 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 203 (MapPartitionsRDD[948] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:45.580 INFO TaskSchedulerImpl - Adding task set 203.0 with 1 tasks resource profile 0
19:48:45.581 INFO TaskSetManager - Starting task 0.0 in stage 203.0 (TID 259) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
19:48:45.581 INFO Executor - Running task 0.0 in stage 203.0 (TID 259)
19:48:45.610 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:45.617 INFO Executor - Finished task 0.0 in stage 203.0 (TID 259). 989 bytes result sent to driver
19:48:45.617 INFO TaskSetManager - Finished task 0.0 in stage 203.0 (TID 259) in 36 ms on localhost (executor driver) (1/1)
19:48:45.617 INFO TaskSchedulerImpl - Removed TaskSet 203.0, whose tasks have all completed, from pool
19:48:45.617 INFO DAGScheduler - ResultStage 203 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.055 s
19:48:45.617 INFO DAGScheduler - Job 151 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.617 INFO TaskSchedulerImpl - Killing all running tasks in stage 203: Stage finished
19:48:45.617 INFO DAGScheduler - Job 151 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.055878 s
19:48:45.622 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:45.622 INFO DAGScheduler - Got job 152 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:45.622 INFO DAGScheduler - Final stage: ResultStage 204 (count at ReadsSparkSinkUnitTest.java:185)
19:48:45.622 INFO DAGScheduler - Parents of final stage: List()
19:48:45.622 INFO DAGScheduler - Missing parents: List()
19:48:45.622 INFO DAGScheduler - Submitting ResultStage 204 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:45.628 INFO MemoryStore - Block broadcast_403 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
19:48:45.629 INFO MemoryStore - Block broadcast_403_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.4 MiB)
19:48:45.629 INFO BlockManagerInfo - Added broadcast_403_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.6 MiB)
19:48:45.629 INFO SparkContext - Created broadcast 403 from broadcast at DAGScheduler.scala:1580
19:48:45.629 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 204 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:45.629 INFO TaskSchedulerImpl - Adding task set 204.0 with 1 tasks resource profile 0
19:48:45.630 INFO TaskSetManager - Starting task 0.0 in stage 204.0 (TID 260) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:45.630 INFO Executor - Running task 0.0 in stage 204.0 (TID 260)
19:48:45.640 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest312736268602049795677.bam:0+236517
19:48:45.643 INFO Executor - Finished task 0.0 in stage 204.0 (TID 260). 989 bytes result sent to driver
19:48:45.643 INFO TaskSetManager - Finished task 0.0 in stage 204.0 (TID 260) in 14 ms on localhost (executor driver) (1/1)
19:48:45.643 INFO TaskSchedulerImpl - Removed TaskSet 204.0, whose tasks have all completed, from pool
19:48:45.643 INFO DAGScheduler - ResultStage 204 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.021 s
19:48:45.643 INFO DAGScheduler - Job 152 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.643 INFO TaskSchedulerImpl - Killing all running tasks in stage 204: Stage finished
19:48:45.644 INFO DAGScheduler - Job 152 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.021709 s
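[Note] Jobs 150-152 are the round-trip check: the merged /tmp/...bam is read back through newAPIHadoopFile, collected (ReadsSparkSinkUnitTest.java:182), and its count compared against a count over the original input (ReadsSparkSinkUnitTest.java:185). A minimal sketch of that read-back-and-count pattern; TextInputFormat stands in here for the BAM-specific input format the test actually uses, and the class name is hypothetical.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public final class ReadBackCheck {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("read-back-check").setMaster("local[*]");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaPairRDD<LongWritable, Text> written = sc.newAPIHadoopFile(
                        args[0], TextInputFormat.class, LongWritable.class, Text.class, new Configuration());
                JavaPairRDD<LongWritable, Text> original = sc.newAPIHadoopFile(
                        args[1], TextInputFormat.class, LongWritable.class, Text.class, new Configuration());
                // The unit test asserts the two counts match (and collects the records to compare contents).
                System.out.println(written.count() == original.count() ? "counts match" : "counts differ");
            }
        }
    }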
19:48:45.651 INFO MemoryStore - Block broadcast_404 stored as values in memory (estimated size 576.0 B, free 1918.4 MiB)
19:48:45.651 INFO MemoryStore - Block broadcast_404_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.4 MiB)
19:48:45.651 INFO BlockManagerInfo - Added broadcast_404_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.6 MiB)
19:48:45.652 INFO SparkContext - Created broadcast 404 from broadcast at CramSource.java:114
19:48:45.653 INFO MemoryStore - Block broadcast_405 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:45.663 INFO MemoryStore - Block broadcast_405_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:45.663 INFO BlockManagerInfo - Added broadcast_405_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:45.663 INFO SparkContext - Created broadcast 405 from newAPIHadoopFile at PathSplitSource.java:96
19:48:45.680 INFO MemoryStore - Block broadcast_406 stored as values in memory (estimated size 576.0 B, free 1918.0 MiB)
19:48:45.680 INFO MemoryStore - Block broadcast_406_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.0 MiB)
19:48:45.680 INFO BlockManagerInfo - Added broadcast_406_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.6 MiB)
19:48:45.680 INFO SparkContext - Created broadcast 406 from broadcast at CramSource.java:114
19:48:45.681 INFO MemoryStore - Block broadcast_407 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
19:48:45.687 INFO MemoryStore - Block broadcast_407_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:45.687 INFO BlockManagerInfo - Added broadcast_407_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:45.687 INFO SparkContext - Created broadcast 407 from newAPIHadoopFile at PathSplitSource.java:96
19:48:45.701 INFO FileInputFormat - Total input files to process : 1
19:48:45.702 INFO MemoryStore - Block broadcast_408 stored as values in memory (estimated size 6.0 KiB, free 1917.7 MiB)
19:48:45.702 INFO MemoryStore - Block broadcast_408_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
19:48:45.702 INFO BlockManagerInfo - Added broadcast_408_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.5 MiB)
19:48:45.703 INFO SparkContext - Created broadcast 408 from broadcast at ReadsSparkSink.java:133
19:48:45.703 INFO MemoryStore - Block broadcast_409 stored as values in memory (estimated size 6.2 KiB, free 1917.7 MiB)
19:48:45.703 INFO MemoryStore - Block broadcast_409_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
19:48:45.703 INFO BlockManagerInfo - Added broadcast_409_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.5 MiB)
19:48:45.704 INFO SparkContext - Created broadcast 409 from broadcast at CramSink.java:76
19:48:45.705 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:45.705 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:45.705 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
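[Note] As in the earlier BAM write, no PathOutputCommitterFactory is configured, so the job falls back to the classic FileOutputCommitter with commit algorithm 1: each task's output is renamed into _temporary/0/task_*, then promoted to the final directory at job commit. A small configuration sketch using the stock Hadoop key (illustrative helper class, not part of the test):

    import org.apache.hadoop.conf.Configuration;

    public final class CommitterConfig {
        public static Configuration withClassicV1Committer() {
            Configuration conf = new Configuration();
            // Algorithm 1: task commit renames into _temporary/0/task_*, job commit promotes those
            // directories to the final output location (matches the "version is 1" lines above).
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            return conf;
        }
    }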
19:48:45.721 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:45.722 INFO DAGScheduler - Registering RDD 978 (mapToPair at SparkUtils.java:161) as input to shuffle 41
19:48:45.722 INFO DAGScheduler - Got job 153 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:45.722 INFO DAGScheduler - Final stage: ResultStage 206 (runJob at SparkHadoopWriter.scala:83)
19:48:45.722 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 205)
19:48:45.722 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 205)
19:48:45.722 INFO DAGScheduler - Submitting ShuffleMapStage 205 (MapPartitionsRDD[978] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:45.735 INFO MemoryStore - Block broadcast_410 stored as values in memory (estimated size 292.8 KiB, free 1917.4 MiB)
19:48:45.736 INFO MemoryStore - Block broadcast_410_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.3 MiB)
19:48:45.736 INFO BlockManagerInfo - Added broadcast_410_piece0 in memory on localhost:36125 (size: 107.3 KiB, free: 1919.4 MiB)
19:48:45.736 INFO SparkContext - Created broadcast 410 from broadcast at DAGScheduler.scala:1580
19:48:45.737 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 205 (MapPartitionsRDD[978] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:45.737 INFO TaskSchedulerImpl - Adding task set 205.0 with 1 tasks resource profile 0
19:48:45.737 INFO TaskSetManager - Starting task 0.0 in stage 205.0 (TID 261) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
19:48:45.737 INFO Executor - Running task 0.0 in stage 205.0 (TID 261)
19:48:45.758 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:45.767 INFO Executor - Finished task 0.0 in stage 205.0 (TID 261). 1148 bytes result sent to driver
19:48:45.768 INFO TaskSetManager - Finished task 0.0 in stage 205.0 (TID 261) in 31 ms on localhost (executor driver) (1/1)
19:48:45.768 INFO TaskSchedulerImpl - Removed TaskSet 205.0, whose tasks have all completed, from pool
19:48:45.768 INFO DAGScheduler - ShuffleMapStage 205 (mapToPair at SparkUtils.java:161) finished in 0.046 s
19:48:45.768 INFO DAGScheduler - looking for newly runnable stages
19:48:45.768 INFO DAGScheduler - running: HashSet()
19:48:45.768 INFO DAGScheduler - waiting: HashSet(ResultStage 206)
19:48:45.768 INFO DAGScheduler - failed: HashSet()
19:48:45.768 INFO DAGScheduler - Submitting ResultStage 206 (MapPartitionsRDD[983] at mapToPair at CramSink.java:89), which has no missing parents
19:48:45.779 INFO MemoryStore - Block broadcast_411 stored as values in memory (estimated size 153.2 KiB, free 1917.1 MiB)
19:48:45.784 INFO MemoryStore - Block broadcast_411_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1917.1 MiB)
19:48:45.784 INFO BlockManagerInfo - Removed broadcast_403_piece0 on localhost:36125 in memory (size: 54.5 KiB, free: 1919.5 MiB)
19:48:45.784 INFO BlockManagerInfo - Added broadcast_411_piece0 in memory on localhost:36125 (size: 58.1 KiB, free: 1919.4 MiB)
19:48:45.784 INFO SparkContext - Created broadcast 411 from broadcast at DAGScheduler.scala:1580
19:48:45.784 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 206 (MapPartitionsRDD[983] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
19:48:45.784 INFO TaskSchedulerImpl - Adding task set 206.0 with 1 tasks resource profile 0
19:48:45.785 INFO BlockManagerInfo - Removed broadcast_402_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.6 MiB)
19:48:45.785 INFO TaskSetManager - Starting task 0.0 in stage 206.0 (TID 262) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:45.785 INFO Executor - Running task 0.0 in stage 206.0 (TID 262)
19:48:45.785 INFO BlockManagerInfo - Removed broadcast_400_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:45.786 INFO BlockManagerInfo - Removed broadcast_399_piece0 on localhost:36125 in memory (size: 231.0 B, free: 1919.6 MiB)
19:48:45.786 INFO BlockManagerInfo - Removed broadcast_407_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:45.787 INFO BlockManagerInfo - Removed broadcast_393_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:45.787 INFO BlockManagerInfo - Removed broadcast_401_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.8 MiB)
19:48:45.788 INFO BlockManagerInfo - Removed broadcast_406_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.8 MiB)
19:48:45.790 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:45.790 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:45.796 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:45.796 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:45.796 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:45.796 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:45.796 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:45.796 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:45.833 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948453268519747672898176_0983_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest5.someOtherPlace1205916080753836250/_temporary/0/task_202507151948453268519747672898176_0983_r_000000
19:48:45.833 INFO SparkHadoopMapRedUtil - attempt_202507151948453268519747672898176_0983_r_000000_0: Committed. Elapsed time: 0 ms.
19:48:45.834 INFO Executor - Finished task 0.0 in stage 206.0 (TID 262). 1858 bytes result sent to driver
19:48:45.834 INFO TaskSetManager - Finished task 0.0 in stage 206.0 (TID 262) in 49 ms on localhost (executor driver) (1/1)
19:48:45.834 INFO TaskSchedulerImpl - Removed TaskSet 206.0, whose tasks have all completed, from pool
19:48:45.834 INFO DAGScheduler - ResultStage 206 (runJob at SparkHadoopWriter.scala:83) finished in 0.066 s
19:48:45.834 INFO DAGScheduler - Job 153 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.834 INFO TaskSchedulerImpl - Killing all running tasks in stage 206: Stage finished
19:48:45.834 INFO DAGScheduler - Job 153 finished: runJob at SparkHadoopWriter.scala:83, took 0.112988 s
19:48:45.835 INFO SparkHadoopWriter - Start to commit write Job job_202507151948453268519747672898176_0983.
19:48:45.840 INFO SparkHadoopWriter - Write Job job_202507151948453268519747672898176_0983 committed. Elapsed time: 5 ms.
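[Note] Job 153 shows the usual two-stage shape of these writes: a ShuffleMapStage produced by the keying mapToPair (SparkUtils.java:161) feeds a ResultStage that writes the sorted output via SparkHadoopWriter. A generic, self-contained illustration of that shape; the key, the sort, and the class name are placeholders, not GATK's actual sort logic.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    import java.util.Arrays;

    public final class TwoStageWrite {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("two-stage-write").setMaster("local[*]");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaRDD<String> reads = sc.parallelize(Arrays.asList("r3", "r1", "r2"));
                // Keying the records is the shuffle input (the ShuffleMapStage in the log).
                JavaPairRDD<String, String> keyed = reads.mapToPair(r -> new Tuple2<>(r, r));
                // sortByKey forces the shuffle; the save then runs as the ResultStage
                // ("runJob at SparkHadoopWriter.scala" in the log).
                keyed.sortByKey().saveAsTextFile(args[0]);
            }
        }
    }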
19:48:45.851 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest53930124499384889397.cram
19:48:45.856 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest53930124499384889397.cram done
19:48:45.857 INFO MemoryStore - Block broadcast_412 stored as values in memory (estimated size 504.0 B, free 1919.0 MiB)
19:48:45.858 INFO MemoryStore - Block broadcast_412_piece0 stored as bytes in memory (estimated size 159.0 B, free 1919.0 MiB)
19:48:45.858 INFO BlockManagerInfo - Added broadcast_412_piece0 in memory on localhost:36125 (size: 159.0 B, free: 1919.8 MiB)
19:48:45.858 INFO SparkContext - Created broadcast 412 from broadcast at CramSource.java:114
19:48:45.859 INFO MemoryStore - Block broadcast_413 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
19:48:45.865 INFO MemoryStore - Block broadcast_413_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
19:48:45.865 INFO BlockManagerInfo - Added broadcast_413_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:45.866 INFO SparkContext - Created broadcast 413 from newAPIHadoopFile at PathSplitSource.java:96
19:48:45.879 INFO FileInputFormat - Total input files to process : 1
19:48:45.904 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:45.904 INFO DAGScheduler - Got job 154 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:45.904 INFO DAGScheduler - Final stage: ResultStage 207 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:45.904 INFO DAGScheduler - Parents of final stage: List()
19:48:45.904 INFO DAGScheduler - Missing parents: List()
19:48:45.904 INFO DAGScheduler - Submitting ResultStage 207 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:45.916 INFO MemoryStore - Block broadcast_414 stored as values in memory (estimated size 286.8 KiB, free 1918.4 MiB)
19:48:45.917 INFO MemoryStore - Block broadcast_414_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.3 MiB)
19:48:45.917 INFO BlockManagerInfo - Added broadcast_414_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.6 MiB)
19:48:45.917 INFO SparkContext - Created broadcast 414 from broadcast at DAGScheduler.scala:1580
19:48:45.917 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 207 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:45.917 INFO TaskSchedulerImpl - Adding task set 207.0 with 1 tasks resource profile 0
19:48:45.918 INFO TaskSetManager - Starting task 0.0 in stage 207.0 (TID 263) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:45.918 INFO Executor - Running task 0.0 in stage 207.0 (TID 263)
19:48:45.938 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest53930124499384889397.cram:0+43713
19:48:45.948 INFO Executor - Finished task 0.0 in stage 207.0 (TID 263). 154058 bytes result sent to driver
19:48:45.949 INFO TaskSetManager - Finished task 0.0 in stage 207.0 (TID 263) in 31 ms on localhost (executor driver) (1/1)
19:48:45.949 INFO TaskSchedulerImpl - Removed TaskSet 207.0, whose tasks have all completed, from pool
19:48:45.949 INFO DAGScheduler - ResultStage 207 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.044 s
19:48:45.949 INFO DAGScheduler - Job 154 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.949 INFO TaskSchedulerImpl - Killing all running tasks in stage 207: Stage finished
19:48:45.949 INFO DAGScheduler - Job 154 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.044935 s
19:48:45.954 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:45.954 INFO DAGScheduler - Got job 155 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:45.954 INFO DAGScheduler - Final stage: ResultStage 208 (count at ReadsSparkSinkUnitTest.java:185)
19:48:45.954 INFO DAGScheduler - Parents of final stage: List()
19:48:45.955 INFO DAGScheduler - Missing parents: List()
19:48:45.955 INFO DAGScheduler - Submitting ResultStage 208 (MapPartitionsRDD[972] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:45.966 INFO MemoryStore - Block broadcast_415 stored as values in memory (estimated size 286.8 KiB, free 1918.0 MiB)
19:48:45.967 INFO MemoryStore - Block broadcast_415_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.9 MiB)
19:48:45.967 INFO BlockManagerInfo - Added broadcast_415_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.5 MiB)
19:48:45.967 INFO SparkContext - Created broadcast 415 from broadcast at DAGScheduler.scala:1580
19:48:45.967 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 208 (MapPartitionsRDD[972] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:45.967 INFO TaskSchedulerImpl - Adding task set 208.0 with 1 tasks resource profile 0
19:48:45.968 INFO TaskSetManager - Starting task 0.0 in stage 208.0 (TID 264) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
19:48:45.968 INFO Executor - Running task 0.0 in stage 208.0 (TID 264)
19:48:45.988 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:45.994 INFO Executor - Finished task 0.0 in stage 208.0 (TID 264). 989 bytes result sent to driver
19:48:45.994 INFO TaskSetManager - Finished task 0.0 in stage 208.0 (TID 264) in 26 ms on localhost (executor driver) (1/1)
19:48:45.994 INFO TaskSchedulerImpl - Removed TaskSet 208.0, whose tasks have all completed, from pool
19:48:45.994 INFO DAGScheduler - ResultStage 208 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.039 s
19:48:45.994 INFO DAGScheduler - Job 155 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:45.994 INFO TaskSchedulerImpl - Killing all running tasks in stage 208: Stage finished
19:48:45.995 INFO DAGScheduler - Job 155 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.040412 s
19:48:45.998 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:45.998 INFO DAGScheduler - Got job 156 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:45.998 INFO DAGScheduler - Final stage: ResultStage 209 (count at ReadsSparkSinkUnitTest.java:185)
19:48:45.998 INFO DAGScheduler - Parents of final stage: List()
19:48:45.998 INFO DAGScheduler - Missing parents: List()
19:48:45.998 INFO DAGScheduler - Submitting ResultStage 209 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:46.010 INFO MemoryStore - Block broadcast_416 stored as values in memory (estimated size 286.8 KiB, free 1917.7 MiB)
19:48:46.011 INFO MemoryStore - Block broadcast_416_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.6 MiB)
19:48:46.011 INFO BlockManagerInfo - Added broadcast_416_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.4 MiB)
19:48:46.011 INFO SparkContext - Created broadcast 416 from broadcast at DAGScheduler.scala:1580
19:48:46.011 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 209 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:46.011 INFO TaskSchedulerImpl - Adding task set 209.0 with 1 tasks resource profile 0
19:48:46.012 INFO TaskSetManager - Starting task 0.0 in stage 209.0 (TID 265) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
19:48:46.012 INFO Executor - Running task 0.0 in stage 209.0 (TID 265)
19:48:46.032 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest53930124499384889397.cram:0+43713
19:48:46.041 INFO Executor - Finished task 0.0 in stage 209.0 (TID 265). 989 bytes result sent to driver
19:48:46.041 INFO TaskSetManager - Finished task 0.0 in stage 209.0 (TID 265) in 30 ms on localhost (executor driver) (1/1)
19:48:46.041 INFO TaskSchedulerImpl - Removed TaskSet 209.0, whose tasks have all completed, from pool
19:48:46.041 INFO DAGScheduler - ResultStage 209 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.043 s
19:48:46.041 INFO DAGScheduler - Job 156 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:46.041 INFO TaskSchedulerImpl - Killing all running tasks in stage 209: Stage finished
19:48:46.041 INFO DAGScheduler - Job 156 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.043478 s
19:48:46.050 INFO MemoryStore - Block broadcast_417 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
19:48:46.061 INFO MemoryStore - Block broadcast_417_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.2 MiB)
19:48:46.061 INFO BlockManagerInfo - Added broadcast_417_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:46.061 INFO SparkContext - Created broadcast 417 from newAPIHadoopFile at PathSplitSource.java:96
19:48:46.088 INFO MemoryStore - Block broadcast_418 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
19:48:46.094 INFO MemoryStore - Block broadcast_418_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
19:48:46.094 INFO BlockManagerInfo - Added broadcast_418_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:46.094 INFO SparkContext - Created broadcast 418 from newAPIHadoopFile at PathSplitSource.java:96
19:48:46.113 INFO FileInputFormat - Total input files to process : 1
19:48:46.115 INFO MemoryStore - Block broadcast_419 stored as values in memory (estimated size 160.7 KiB, free 1916.7 MiB)
19:48:46.116 INFO MemoryStore - Block broadcast_419_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.7 MiB)
19:48:46.116 INFO BlockManagerInfo - Added broadcast_419_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:46.116 INFO SparkContext - Created broadcast 419 from broadcast at ReadsSparkSink.java:133
19:48:46.119 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19:48:46.119 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:46.119 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:46.135 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:46.136 INFO DAGScheduler - Registering RDD 1003 (mapToPair at SparkUtils.java:161) as input to shuffle 42
19:48:46.136 INFO DAGScheduler - Got job 157 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:46.136 INFO DAGScheduler - Final stage: ResultStage 211 (runJob at SparkHadoopWriter.scala:83)
19:48:46.136 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 210)
19:48:46.136 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 210)
19:48:46.136 INFO DAGScheduler - Submitting ShuffleMapStage 210 (MapPartitionsRDD[1003] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:46.153 INFO MemoryStore - Block broadcast_420 stored as values in memory (estimated size 520.4 KiB, free 1916.2 MiB)
19:48:46.154 INFO MemoryStore - Block broadcast_420_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.0 MiB)
19:48:46.154 INFO BlockManagerInfo - Added broadcast_420_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:48:46.154 INFO SparkContext - Created broadcast 420 from broadcast at DAGScheduler.scala:1580
19:48:46.155 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 210 (MapPartitionsRDD[1003] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:46.155 INFO TaskSchedulerImpl - Adding task set 210.0 with 1 tasks resource profile 0
19:48:46.155 INFO TaskSetManager - Starting task 0.0 in stage 210.0 (TID 266) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:46.155 INFO Executor - Running task 0.0 in stage 210.0 (TID 266)
19:48:46.186 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:46.200 INFO Executor - Finished task 0.0 in stage 210.0 (TID 266). 1148 bytes result sent to driver
19:48:46.200 INFO TaskSetManager - Finished task 0.0 in stage 210.0 (TID 266) in 45 ms on localhost (executor driver) (1/1)
19:48:46.200 INFO TaskSchedulerImpl - Removed TaskSet 210.0, whose tasks have all completed, from pool
19:48:46.201 INFO DAGScheduler - ShuffleMapStage 210 (mapToPair at SparkUtils.java:161) finished in 0.065 s
19:48:46.201 INFO DAGScheduler - looking for newly runnable stages
19:48:46.201 INFO DAGScheduler - running: HashSet()
19:48:46.201 INFO DAGScheduler - waiting: HashSet(ResultStage 211)
19:48:46.201 INFO DAGScheduler - failed: HashSet()
19:48:46.201 INFO DAGScheduler - Submitting ResultStage 211 (MapPartitionsRDD[1009] at saveAsTextFile at SamSink.java:65), which has no missing parents
19:48:46.207 INFO MemoryStore - Block broadcast_421 stored as values in memory (estimated size 241.1 KiB, free 1915.8 MiB)
19:48:46.208 INFO MemoryStore - Block broadcast_421_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1915.7 MiB)
19:48:46.208 INFO BlockManagerInfo - Added broadcast_421_piece0 in memory on localhost:36125 (size: 66.9 KiB, free: 1919.1 MiB)
19:48:46.208 INFO SparkContext - Created broadcast 421 from broadcast at DAGScheduler.scala:1580
19:48:46.208 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 211 (MapPartitionsRDD[1009] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
19:48:46.208 INFO TaskSchedulerImpl - Adding task set 211.0 with 1 tasks resource profile 0
19:48:46.209 INFO TaskSetManager - Starting task 0.0 in stage 211.0 (TID 267) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:46.209 INFO Executor - Running task 0.0 in stage 211.0 (TID 267)
19:48:46.214 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:46.214 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:46.225 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19:48:46.225 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:46.225 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:46.241 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948465326271719382133485_1009_m_000000_0' to file:/tmp/ReadsSparkSinkUnitTest6.someOtherPlace827554891467393899/_temporary/0/task_202507151948465326271719382133485_1009_m_000000
19:48:46.241 INFO SparkHadoopMapRedUtil - attempt_202507151948465326271719382133485_1009_m_000000_0: Committed. Elapsed time: 0 ms.
19:48:46.245 INFO Executor - Finished task 0.0 in stage 211.0 (TID 267). 1944 bytes result sent to driver
19:48:46.245 INFO BlockManagerInfo - Removed broadcast_404_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.1 MiB)
19:48:46.246 INFO TaskSetManager - Finished task 0.0 in stage 211.0 (TID 267) in 37 ms on localhost (executor driver) (1/1)
19:48:46.246 INFO TaskSchedulerImpl - Removed TaskSet 211.0, whose tasks have all completed, from pool
19:48:46.246 INFO BlockManagerInfo - Removed broadcast_420_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.3 MiB)
19:48:46.246 INFO DAGScheduler - ResultStage 211 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
19:48:46.246 INFO DAGScheduler - Job 157 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:46.246 INFO TaskSchedulerImpl - Killing all running tasks in stage 211: Stage finished
19:48:46.246 INFO DAGScheduler - Job 157 finished: runJob at SparkHadoopWriter.scala:83, took 0.110858 s
19:48:46.247 INFO BlockManagerInfo - Removed broadcast_413_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:46.247 INFO SparkHadoopWriter - Start to commit write Job job_202507151948465326271719382133485_1009.
19:48:46.247 INFO BlockManagerInfo - Removed broadcast_409_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.3 MiB)
19:48:46.247 INFO BlockManagerInfo - Removed broadcast_414_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.4 MiB)
19:48:46.248 INFO BlockManagerInfo - Removed broadcast_415_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.5 MiB)
19:48:46.248 INFO BlockManagerInfo - Removed broadcast_416_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.6 MiB)
19:48:46.248 INFO BlockManagerInfo - Removed broadcast_418_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:46.249 INFO BlockManagerInfo - Removed broadcast_405_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:46.249 INFO BlockManagerInfo - Removed broadcast_408_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.7 MiB)
19:48:46.250 INFO BlockManagerInfo - Removed broadcast_410_piece0 on localhost:36125 in memory (size: 107.3 KiB, free: 1919.8 MiB)
19:48:46.250 INFO BlockManagerInfo - Removed broadcast_411_piece0 on localhost:36125 in memory (size: 58.1 KiB, free: 1919.9 MiB)
19:48:46.251 INFO BlockManagerInfo - Removed broadcast_412_piece0 on localhost:36125 in memory (size: 159.0 B, free: 1919.9 MiB)
19:48:46.252 INFO SparkHadoopWriter - Write Job job_202507151948465326271719382133485_1009 committed. Elapsed time: 5 ms.
19:48:46.261 INFO HadoopFileSystemWrapper - Concatenating 2 parts to /tmp/ReadsSparkSinkUnitTest68358288288624871549.sam
19:48:46.265 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest68358288288624871549.sam done
WARNING 2025-07-15 19:48:46 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-07-15 19:48:46 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
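[Note] The two SamReaderFactory warnings appear because the freshly concatenated .sam is handed to htsjdk as a stream or URL whose format cannot be sniffed, so it falls back to assuming SAM; the data itself is fine. Opening the same file from a path lets the format be detected normally, as in this hedged htsjdk sketch (hypothetical class name, path supplied as an argument):

    import htsjdk.samtools.SAMRecord;
    import htsjdk.samtools.SamReader;
    import htsjdk.samtools.SamReaderFactory;
    import htsjdk.samtools.ValidationStringency;

    import java.io.File;

    public final class SamCount {
        public static void main(String[] args) throws Exception {
            // Opening from a File lets htsjdk detect the format from the path/magic bytes,
            // instead of falling back to "assuming SAM format" as it must for a bare stream.
            SamReaderFactory factory = SamReaderFactory.makeDefault()
                    .validationStringency(ValidationStringency.SILENT);
            try (SamReader reader = factory.open(new File(args[0]))) {
                long n = 0;
                for (SAMRecord record : reader) {
                    n++;
                }
                System.out.println("reads: " + n);
            }
        }
    }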
19:48:46.268 INFO MemoryStore - Block broadcast_422 stored as values in memory (estimated size 160.7 KiB, free 1919.0 MiB)
19:48:46.269 INFO MemoryStore - Block broadcast_422_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
19:48:46.269 INFO BlockManagerInfo - Added broadcast_422_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.9 MiB)
19:48:46.269 INFO SparkContext - Created broadcast 422 from broadcast at SamSource.java:78
19:48:46.270 INFO MemoryStore - Block broadcast_423 stored as values in memory (estimated size 297.9 KiB, free 1918.7 MiB)
19:48:46.277 INFO MemoryStore - Block broadcast_423_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
19:48:46.277 INFO BlockManagerInfo - Added broadcast_423_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.8 MiB)
19:48:46.277 INFO SparkContext - Created broadcast 423 from newAPIHadoopFile at SamSource.java:108
19:48:46.279 INFO FileInputFormat - Total input files to process : 1
19:48:46.282 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:46.283 INFO DAGScheduler - Got job 158 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:46.283 INFO DAGScheduler - Final stage: ResultStage 212 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:46.283 INFO DAGScheduler - Parents of final stage: List()
19:48:46.283 INFO DAGScheduler - Missing parents: List()
19:48:46.283 INFO DAGScheduler - Submitting ResultStage 212 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:46.283 INFO MemoryStore - Block broadcast_424 stored as values in memory (estimated size 7.5 KiB, free 1918.7 MiB)
19:48:46.284 INFO MemoryStore - Block broadcast_424_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.7 MiB)
19:48:46.284 INFO BlockManagerInfo - Added broadcast_424_piece0 in memory on localhost:36125 (size: 3.8 KiB, free: 1919.8 MiB)
19:48:46.284 INFO SparkContext - Created broadcast 424 from broadcast at DAGScheduler.scala:1580
19:48:46.284 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 212 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:46.284 INFO TaskSchedulerImpl - Adding task set 212.0 with 1 tasks resource profile 0
19:48:46.284 INFO TaskSetManager - Starting task 0.0 in stage 212.0 (TID 268) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:46.285 INFO Executor - Running task 0.0 in stage 212.0 (TID 268)
19:48:46.286 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest68358288288624871549.sam:0+847558
19:48:46.297 INFO Executor - Finished task 0.0 in stage 212.0 (TID 268). 651526 bytes result sent to driver
19:48:46.299 INFO TaskSetManager - Finished task 0.0 in stage 212.0 (TID 268) in 15 ms on localhost (executor driver) (1/1)
19:48:46.299 INFO TaskSchedulerImpl - Removed TaskSet 212.0, whose tasks have all completed, from pool
19:48:46.299 INFO DAGScheduler - ResultStage 212 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.016 s
19:48:46.299 INFO DAGScheduler - Job 158 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:46.299 INFO TaskSchedulerImpl - Killing all running tasks in stage 212: Stage finished
19:48:46.299 INFO DAGScheduler - Job 158 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.016553 s
19:48:46.314 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:46.314 INFO DAGScheduler - Got job 159 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:46.314 INFO DAGScheduler - Final stage: ResultStage 213 (count at ReadsSparkSinkUnitTest.java:185)
19:48:46.314 INFO DAGScheduler - Parents of final stage: List()
19:48:46.314 INFO DAGScheduler - Missing parents: List()
19:48:46.314 INFO DAGScheduler - Submitting ResultStage 213 (MapPartitionsRDD[996] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:46.337 INFO MemoryStore - Block broadcast_425 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
19:48:46.339 INFO MemoryStore - Block broadcast_425_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.1 MiB)
19:48:46.339 INFO BlockManagerInfo - Added broadcast_425_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:46.339 INFO SparkContext - Created broadcast 425 from broadcast at DAGScheduler.scala:1580
19:48:46.339 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 213 (MapPartitionsRDD[996] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:46.339 INFO TaskSchedulerImpl - Adding task set 213.0 with 1 tasks resource profile 0
19:48:46.339 INFO TaskSetManager - Starting task 0.0 in stage 213.0 (TID 269) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:46.340 INFO Executor - Running task 0.0 in stage 213.0 (TID 269)
19:48:46.369 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:46.380 INFO Executor - Finished task 0.0 in stage 213.0 (TID 269). 989 bytes result sent to driver
19:48:46.380 INFO TaskSetManager - Finished task 0.0 in stage 213.0 (TID 269) in 41 ms on localhost (executor driver) (1/1)
19:48:46.380 INFO TaskSchedulerImpl - Removed TaskSet 213.0, whose tasks have all completed, from pool
19:48:46.381 INFO DAGScheduler - ResultStage 213 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.066 s
19:48:46.381 INFO DAGScheduler - Job 159 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:46.381 INFO TaskSchedulerImpl - Killing all running tasks in stage 213: Stage finished
19:48:46.381 INFO DAGScheduler - Job 159 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066972 s
19:48:46.385 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:46.386 INFO DAGScheduler - Got job 160 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:46.386 INFO DAGScheduler - Final stage: ResultStage 214 (count at ReadsSparkSinkUnitTest.java:185)
19:48:46.386 INFO DAGScheduler - Parents of final stage: List()
19:48:46.386 INFO DAGScheduler - Missing parents: List()
19:48:46.386 INFO DAGScheduler - Submitting ResultStage 214 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:46.386 INFO MemoryStore - Block broadcast_426 stored as values in memory (estimated size 7.4 KiB, free 1918.1 MiB)
19:48:46.387 INFO MemoryStore - Block broadcast_426_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.1 MiB)
19:48:46.387 INFO BlockManagerInfo - Added broadcast_426_piece0 in memory on localhost:36125 (size: 3.8 KiB, free: 1919.7 MiB)
19:48:46.387 INFO SparkContext - Created broadcast 426 from broadcast at DAGScheduler.scala:1580
19:48:46.387 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 214 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:46.387 INFO TaskSchedulerImpl - Adding task set 214.0 with 1 tasks resource profile 0
19:48:46.388 INFO TaskSetManager - Starting task 0.0 in stage 214.0 (TID 270) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:48:46.388 INFO Executor - Running task 0.0 in stage 214.0 (TID 270)
19:48:46.389 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest68358288288624871549.sam:0+847558
19:48:46.401 INFO Executor - Finished task 0.0 in stage 214.0 (TID 270). 946 bytes result sent to driver
19:48:46.401 INFO TaskSetManager - Finished task 0.0 in stage 214.0 (TID 270) in 13 ms on localhost (executor driver) (1/1)
19:48:46.401 INFO TaskSchedulerImpl - Removed TaskSet 214.0, whose tasks have all completed, from pool
19:48:46.401 INFO DAGScheduler - ResultStage 214 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.015 s
19:48:46.401 INFO DAGScheduler - Job 160 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:46.401 INFO TaskSchedulerImpl - Killing all running tasks in stage 214: Stage finished
19:48:46.401 INFO DAGScheduler - Job 160 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.015891 s
19:48:46.414 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:46.415 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:46.416 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:46.416 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:46.420 INFO MemoryStore - Block broadcast_427 stored as values in memory (estimated size 297.9 KiB, free 1917.8 MiB)
19:48:46.430 INFO MemoryStore - Block broadcast_427_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.8 MiB)
19:48:46.430 INFO BlockManagerInfo - Added broadcast_427_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:46.431 INFO SparkContext - Created broadcast 427 from newAPIHadoopFile at PathSplitSource.java:96
19:48:46.457 INFO MemoryStore - Block broadcast_428 stored as values in memory (estimated size 297.9 KiB, free 1917.5 MiB)
19:48:46.463 INFO MemoryStore - Block broadcast_428_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.4 MiB)
19:48:46.463 INFO BlockManagerInfo - Added broadcast_428_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:46.463 INFO SparkContext - Created broadcast 428 from newAPIHadoopFile at PathSplitSource.java:96
19:48:46.483 INFO FileInputFormat - Total input files to process : 1
19:48:46.485 INFO MemoryStore - Block broadcast_429 stored as values in memory (estimated size 160.7 KiB, free 1917.3 MiB)
19:48:46.485 INFO MemoryStore - Block broadcast_429_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
19:48:46.485 INFO BlockManagerInfo - Added broadcast_429_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.6 MiB)
19:48:46.486 INFO SparkContext - Created broadcast 429 from broadcast at ReadsSparkSink.java:133
19:48:46.487 INFO MemoryStore - Block broadcast_430 stored as values in memory (estimated size 163.2 KiB, free 1917.1 MiB)
19:48:46.487 INFO MemoryStore - Block broadcast_430_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.1 MiB)
19:48:46.487 INFO BlockManagerInfo - Added broadcast_430_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:46.488 INFO SparkContext - Created broadcast 430 from broadcast at BamSink.java:76
19:48:46.489 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts dst=null perm=null proto=rpc
19:48:46.490 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:46.490 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:46.490 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:46.490 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:46.496 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:46.497 INFO DAGScheduler - Registering RDD 1028 (mapToPair at SparkUtils.java:161) as input to shuffle 43
19:48:46.497 INFO DAGScheduler - Got job 161 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:46.497 INFO DAGScheduler - Final stage: ResultStage 216 (runJob at SparkHadoopWriter.scala:83)
19:48:46.497 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 215)
19:48:46.497 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 215)
19:48:46.497 INFO DAGScheduler - Submitting ShuffleMapStage 215 (MapPartitionsRDD[1028] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:46.514 INFO MemoryStore - Block broadcast_431 stored as values in memory (estimated size 520.4 KiB, free 1916.6 MiB)
19:48:46.515 INFO MemoryStore - Block broadcast_431_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.4 MiB)
19:48:46.515 INFO BlockManagerInfo - Added broadcast_431_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.4 MiB)
19:48:46.515 INFO SparkContext - Created broadcast 431 from broadcast at DAGScheduler.scala:1580
19:48:46.516 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 215 (MapPartitionsRDD[1028] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:46.516 INFO TaskSchedulerImpl - Adding task set 215.0 with 1 tasks resource profile 0
19:48:46.516 INFO TaskSetManager - Starting task 0.0 in stage 215.0 (TID 271) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:46.516 INFO Executor - Running task 0.0 in stage 215.0 (TID 271)
19:48:46.546 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:46.561 INFO Executor - Finished task 0.0 in stage 215.0 (TID 271). 1148 bytes result sent to driver
19:48:46.562 INFO TaskSetManager - Finished task 0.0 in stage 215.0 (TID 271) in 46 ms on localhost (executor driver) (1/1)
19:48:46.562 INFO TaskSchedulerImpl - Removed TaskSet 215.0, whose tasks have all completed, from pool
19:48:46.562 INFO DAGScheduler - ShuffleMapStage 215 (mapToPair at SparkUtils.java:161) finished in 0.065 s
19:48:46.562 INFO DAGScheduler - looking for newly runnable stages
19:48:46.562 INFO DAGScheduler - running: HashSet()
19:48:46.562 INFO DAGScheduler - waiting: HashSet(ResultStage 216)
19:48:46.562 INFO DAGScheduler - failed: HashSet()
19:48:46.562 INFO DAGScheduler - Submitting ResultStage 216 (MapPartitionsRDD[1033] at mapToPair at BamSink.java:91), which has no missing parents
19:48:46.569 INFO MemoryStore - Block broadcast_432 stored as values in memory (estimated size 241.5 KiB, free 1916.2 MiB)
19:48:46.570 INFO MemoryStore - Block broadcast_432_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.1 MiB)
19:48:46.570 INFO BlockManagerInfo - Added broadcast_432_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.3 MiB)
19:48:46.570 INFO SparkContext - Created broadcast 432 from broadcast at DAGScheduler.scala:1580
19:48:46.570 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 216 (MapPartitionsRDD[1033] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:46.570 INFO TaskSchedulerImpl - Adding task set 216.0 with 1 tasks resource profile 0
19:48:46.571 INFO TaskSetManager - Starting task 0.0 in stage 216.0 (TID 272) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:46.571 INFO Executor - Running task 0.0 in stage 216.0 (TID 272)
19:48:46.575 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:46.575 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:46.586 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:46.586 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:46.586 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:46.586 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:46.586 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:46.586 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:46.587 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:46.588 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:46.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:46.591 INFO StateChange - BLOCK* allocate blk_1073741871_1047, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/part-r-00000
19:48:46.593 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741871_1047 src: /127.0.0.1:55020 dest: /127.0.0.1:45925
19:48:46.596 INFO clienttrace - src: /127.0.0.1:55020, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741871_1047, duration(ns): 2120243
19:48:46.596 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741871_1047, type=LAST_IN_PIPELINE terminating
19:48:46.597 INFO FSNamesystem - BLOCK* blk_1073741871_1047 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/part-r-00000
19:48:46.997 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:46.998 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:46.999 INFO StateChange - BLOCK* allocate blk_1073741872_1048, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/.part-r-00000.sbi
19:48:47.000 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741872_1048 src: /127.0.0.1:55022 dest: /127.0.0.1:45925
19:48:47.001 INFO clienttrace - src: /127.0.0.1:55022, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741872_1048, duration(ns): 478566
19:48:47.001 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741872_1048, type=LAST_IN_PIPELINE terminating
19:48:47.002 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.003 INFO StateChange - BLOCK* allocate blk_1073741873_1049, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/.part-r-00000.bai
19:48:47.004 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741873_1049 src: /127.0.0.1:55034 dest: /127.0.0.1:45925
19:48:47.005 INFO clienttrace - src: /127.0.0.1:55034, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741873_1049, duration(ns): 251821
19:48:47.005 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741873_1049, type=LAST_IN_PIPELINE terminating
19:48:47.006 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.006 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0 dst=null perm=null proto=rpc
19:48:47.007 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0 dst=null perm=null proto=rpc
19:48:47.007 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/task_202507151948465819843690970245805_1033_r_000000 dst=null perm=null proto=rpc
19:48:47.008 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/_temporary/attempt_202507151948465819843690970245805_1033_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/task_202507151948465819843690970245805_1033_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:47.008 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948465819843690970245805_1033_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/task_202507151948465819843690970245805_1033_r_000000
19:48:47.008 INFO SparkHadoopMapRedUtil - attempt_202507151948465819843690970245805_1033_r_000000_0: Committed. Elapsed time: 1 ms.
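
The sequence above is a FileOutputCommitter (commit algorithm v1) task commit: the attempt writes part-r-00000 plus the hidden .part-r-00000.sbi/.bai indexes under _temporary/0/_temporary/attempt_*, and the commit itself is the single directory rename recorded by the cmd=rename audit entry. A minimal Java sketch of that rename step against the Hadoop FileSystem API follows; the paths are hypothetical stand-ins for the attempt and task directories seen in the log.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class TaskCommitSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder paths standing in for the attempt/task directories above.
            Path attemptDir = new Path("hdfs://localhost:41235/user/runner/out.bam.parts/_temporary/0/_temporary/attempt_0");
            Path taskDir    = new Path("hdfs://localhost:41235/user/runner/out.bam.parts/_temporary/0/task_0");
            FileSystem fs = attemptDir.getFileSystem(conf);
            // Algorithm v1 commits a task by renaming the whole attempt directory
            // into place, which is exactly what the audit trail records.
            if (fs.exists(attemptDir) && fs.rename(attemptDir, taskDir)) {
                System.out.println("Task committed to " + taskDir);
            }
        }
    }
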
19:48:47.009 INFO Executor - Finished task 0.0 in stage 216.0 (TID 272). 1858 bytes result sent to driver
19:48:47.009 INFO TaskSetManager - Finished task 0.0 in stage 216.0 (TID 272) in 438 ms on localhost (executor driver) (1/1)
19:48:47.009 INFO TaskSchedulerImpl - Removed TaskSet 216.0, whose tasks have all completed, from pool
19:48:47.009 INFO DAGScheduler - ResultStage 216 (runJob at SparkHadoopWriter.scala:83) finished in 0.446 s
19:48:47.010 INFO DAGScheduler - Job 161 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:47.010 INFO TaskSchedulerImpl - Killing all running tasks in stage 216: Stage finished
19:48:47.010 INFO DAGScheduler - Job 161 finished: runJob at SparkHadoopWriter.scala:83, took 0.513302 s
19:48:47.010 INFO SparkHadoopWriter - Start to commit write Job job_202507151948465819843690970245805_1033.
19:48:47.010 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:47.011 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts dst=null perm=null proto=rpc
19:48:47.011 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/task_202507151948465819843690970245805_1033_r_000000 dst=null perm=null proto=rpc
19:48:47.012 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:47.012 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/task_202507151948465819843690970245805_1033_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.013 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:47.013 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/task_202507151948465819843690970245805_1033_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.014 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:47.014 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary/0/task_202507151948465819843690970245805_1033_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.015 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:47.016 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.016 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.017 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.spark-staging-1033 dst=null perm=null proto=rpc
19:48:47.017 INFO SparkHadoopWriter - Write Job job_202507151948465819843690970245805_1033 committed. Elapsed time: 7 ms.
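
Job commit under algorithm v1 then promotes the files from each committed task directory into the destination, deletes _temporary and writes an empty _SUCCESS marker, which is what the rename/delete/create audit entries above show (the single allowed=false delete is Spark probing for a .spark-staging directory that was never created, so the denial is harmless). A hedged sketch of that promotion step, assuming a flat task directory and placeholder paths:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class JobCommitSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path outDir  = new Path("hdfs://localhost:41235/user/runner/out.bam.parts");   // placeholder
            Path taskDir = new Path(outDir, "_temporary/0/task_0");                        // placeholder
            FileSystem fs = outDir.getFileSystem(conf);
            // Move every committed task file up into the job output directory.
            for (FileStatus st : fs.listStatus(taskDir)) {
                fs.rename(st.getPath(), new Path(outDir, st.getPath().getName()));
            }
            fs.delete(new Path(outDir, "_temporary"), true);   // recursive cleanup of _temporary
            fs.create(new Path(outDir, "_SUCCESS")).close();   // zero-byte success marker
        }
    }
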
19:48:47.018 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.019 INFO StateChange - BLOCK* allocate blk_1073741874_1050, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/header
19:48:47.020 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741874_1050 src: /127.0.0.1:55048 dest: /127.0.0.1:45925
19:48:47.021 INFO clienttrace - src: /127.0.0.1:55048, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741874_1050, duration(ns): 455423
19:48:47.021 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741874_1050, type=LAST_IN_PIPELINE terminating
19:48:47.022 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.023 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.024 INFO StateChange - BLOCK* allocate blk_1073741875_1051, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/terminator
19:48:47.025 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741875_1051 src: /127.0.0.1:55056 dest: /127.0.0.1:45925
19:48:47.026 INFO clienttrace - src: /127.0.0.1:55056, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741875_1051, duration(ns): 378088
19:48:47.026 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741875_1051, type=LAST_IN_PIPELINE terminating
19:48:47.026 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.027 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts dst=null perm=null proto=rpc
19:48:47.028 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.028 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.028 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam
19:48:47.029 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.029 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.030 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.030 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.030 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam done
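
With the parts committed, the writer assembles the final BAM from a separately written header, the reduce output and the 28-byte BGZF terminator, using an HDFS concat (a NameNode-side metadata operation that stitches blocks together without copying data) followed by a rename onto the target path. A simplified sketch of that assembly; it appends the remaining parts onto the header file rather than a pre-created output file, since the preconditions concat places on its target vary across HDFS versions, and the paths are placeholders.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConcatSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path parts = new Path("hdfs://localhost:41235/user/runner/out.bam.parts");  // placeholder
            FileSystem fs = parts.getFileSystem(conf);
            // Stitch header + part-r-00000 + terminator together on the NameNode...
            fs.concat(new Path(parts, "header"),
                      new Path[] { new Path(parts, "part-r-00000"), new Path(parts, "terminator") });
            // ...then move the assembled file to its final name, as the last rename above does.
            fs.rename(new Path(parts, "header"),
                      new Path("hdfs://localhost:41235/user/runner/out.bam"));
        }
    }
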
19:48:47.031 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.031 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi
19:48:47.031 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts dst=null perm=null proto=rpc
19:48:47.032 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.033 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:47.033 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:47.034 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:47.035 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:47.035 INFO StateChange - BLOCK* allocate blk_1073741876_1052, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi
19:48:47.036 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741876_1052 src: /127.0.0.1:55078 dest: /127.0.0.1:45925
19:48:47.037 INFO clienttrace - src: /127.0.0.1:55078, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741876_1052, duration(ns): 368653
19:48:47.037 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741876_1052, type=LAST_IN_PIPELINE terminating
19:48:47.038 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.038 INFO IndexFileMerger - Done merging .sbi files
19:48:47.038 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai
19:48:47.038 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts dst=null perm=null proto=rpc
19:48:47.039 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.040 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:47.040 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:47.041 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:47.042 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:47.043 INFO StateChange - BLOCK* allocate blk_1073741877_1053, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai
19:48:47.044 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741877_1053 src: /127.0.0.1:55084 dest: /127.0.0.1:45925
19:48:47.045 INFO clienttrace - src: /127.0.0.1:55084, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741877_1053, duration(ns): 471994
19:48:47.045 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741877_1053, type=LAST_IN_PIPELINE terminating
19:48:47.046 INFO FSNamesystem - BLOCK* blk_1073741877_1053 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai
19:48:47.446 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.447 INFO IndexFileMerger - Done merging .bai files
19:48:47.447 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.parts dst=null perm=null proto=rpc
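
After the concat, IndexFileMerger turns the per-part .sbi and .bai files left in the temp directory into single .bam.sbi and .bam.bai indexes beside the output, and the .parts directory is then deleted. With one reducer there is only one hidden index per type to carry over; the sketch below covers just that degenerate single-part case by streaming the lone index to its final name (the real merger understands the index formats and can combine several parts), with placeholder paths.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class IndexMergeSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path parts    = new Path("hdfs://localhost:41235/user/runner/out.bam.parts");  // placeholder
            Path finalSbi = new Path("hdfs://localhost:41235/user/runner/out.bam.sbi");    // placeholder
            FileSystem fs = parts.getFileSystem(conf);
            // Single-part case: copy the hidden per-part index to its final name.
            try (FSDataInputStream in = fs.open(new Path(parts, ".part-r-00000.sbi"));
                 FSDataOutputStream out = fs.create(finalSbi, true)) {
                IOUtils.copyBytes(in, out, conf, false);
            }
            fs.delete(parts, true);   // matches the cmd=delete of the .parts directory above
        }
    }
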
19:48:47.456 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi dst=null perm=null proto=rpc
19:48:47.464 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi dst=null perm=null proto=rpc
19:48:47.464 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi dst=null perm=null proto=rpc
19:48:47.465 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:47.465 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.467 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.467 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.468 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.468 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.469 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:47.471 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:47.472 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:47.472 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:47.472 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi dst=null perm=null proto=rpc
19:48:47.472 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi dst=null perm=null proto=rpc
19:48:47.473 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.sbi dst=null perm=null proto=rpc
19:48:47.474 INFO MemoryStore - Block broadcast_433 stored as values in memory (estimated size 320.0 B, free 1916.1 MiB)
19:48:47.479 INFO MemoryStore - Block broadcast_433_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.1 MiB)
19:48:47.479 INFO BlockManagerInfo - Added broadcast_433_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.3 MiB)
19:48:47.479 INFO SparkContext - Created broadcast 433 from broadcast at BamSource.java:104
19:48:47.479 INFO BlockManagerInfo - Removed broadcast_421_piece0 on localhost:36125 in memory (size: 66.9 KiB, free: 1919.4 MiB)
19:48:47.480 INFO BlockManagerInfo - Removed broadcast_430_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:47.480 INFO MemoryStore - Block broadcast_434 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
19:48:47.481 INFO BlockManagerInfo - Removed broadcast_417_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:47.484 INFO BlockManagerInfo - Removed broadcast_425_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.6 MiB)
19:48:47.484 INFO BlockManagerInfo - Removed broadcast_426_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1919.6 MiB)
19:48:47.485 INFO BlockManagerInfo - Removed broadcast_424_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1919.6 MiB)
19:48:47.485 INFO BlockManagerInfo - Removed broadcast_423_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:47.486 INFO BlockManagerInfo - Removed broadcast_431_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.8 MiB)
19:48:47.486 INFO BlockManagerInfo - Removed broadcast_419_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.8 MiB)
19:48:47.487 INFO BlockManagerInfo - Removed broadcast_432_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.9 MiB)
19:48:47.487 INFO BlockManagerInfo - Removed broadcast_429_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:47.488 INFO BlockManagerInfo - Removed broadcast_428_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:47.489 INFO BlockManagerInfo - Removed broadcast_422_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1920.0 MiB)
19:48:47.492 INFO MemoryStore - Block broadcast_434_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:47.492 INFO BlockManagerInfo - Added broadcast_434_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:47.492 INFO SparkContext - Created broadcast 434 from newAPIHadoopFile at PathSplitSource.java:96
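
Broadcasts 433 and 434 are the small per-source objects (created at BamSource.java:104 and PathSplitSource.java:96) shipped to tasks, while the BlockManagerInfo removals interleaved above are the ContextCleaner unpersisting broadcast pieces from earlier jobs to free storage memory. A hedged Java sketch of the broadcast pattern itself; the String payload and the trivial job are placeholders.

    import java.util.Arrays;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.broadcast.Broadcast;

    public class BroadcastSketch {
        // "header" stands in for whatever small serializable object the source broadcasts.
        static Broadcast<String> shipHeader(JavaSparkContext jsc, String header) {
            Broadcast<String> bc = jsc.broadcast(header);        // appears as broadcast_NNN in the MemoryStore
            jsc.parallelize(Arrays.asList(1, 2, 3))
               .foreach(x -> System.out.println(bc.value()));    // tasks read the shared broadcast value
            return bc;
        }
    }
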
19:48:47.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.507 INFO FileInputFormat - Total input files to process : 1
19:48:47.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.522 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:47.522 INFO DAGScheduler - Got job 162 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:47.522 INFO DAGScheduler - Final stage: ResultStage 217 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:47.522 INFO DAGScheduler - Parents of final stage: List()
19:48:47.522 INFO DAGScheduler - Missing parents: List()
19:48:47.523 INFO DAGScheduler - Submitting ResultStage 217 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:47.530 INFO MemoryStore - Block broadcast_435 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
19:48:47.531 INFO MemoryStore - Block broadcast_435_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
19:48:47.531 INFO BlockManagerInfo - Added broadcast_435_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:47.531 INFO SparkContext - Created broadcast 435 from broadcast at DAGScheduler.scala:1580
19:48:47.532 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 217 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:47.532 INFO TaskSchedulerImpl - Adding task set 217.0 with 1 tasks resource profile 0
19:48:47.532 INFO TaskSetManager - Starting task 0.0 in stage 217.0 (TID 273) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:47.532 INFO Executor - Running task 0.0 in stage 217.0 (TID 273)
19:48:47.544 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam:0+237038
19:48:47.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.547 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.548 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:47.550 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:47.551 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:47.551 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:47.553 INFO Executor - Finished task 0.0 in stage 217.0 (TID 273). 651483 bytes result sent to driver
19:48:47.555 INFO TaskSetManager - Finished task 0.0 in stage 217.0 (TID 273) in 23 ms on localhost (executor driver) (1/1)
19:48:47.555 INFO TaskSchedulerImpl - Removed TaskSet 217.0, whose tasks have all completed, from pool
19:48:47.555 INFO DAGScheduler - ResultStage 217 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.032 s
19:48:47.555 INFO DAGScheduler - Job 162 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:47.555 INFO TaskSchedulerImpl - Killing all running tasks in stage 217: Stage finished
19:48:47.555 INFO DAGScheduler - Job 162 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.033097 s
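
Job 162 is the test reading the freshly written BAM back and collecting every read to the driver (hence the 651483-byte task result), so ReadsSparkSinkUnitTest.java:182 can compare them against the original input; the count jobs that follow at line 185 check the totals as well. A hypothetical sketch of that collect-and-compare pattern in plain Spark Java with TestNG; "Read" and both RDD parameters are placeholders, not the test's actual types.

    import java.util.List;
    import org.apache.spark.api.java.JavaRDD;
    import static org.testng.Assert.assertEquals;

    class RoundTripCheckSketch {
        // original = reads loaded from the source BAM, reloaded = reads read back from the written BAM.
        static <Read> void assertSameSize(JavaRDD<Read> original, JavaRDD<Read> reloaded) {
            List<Read> expected = original.collect();    // analogous to "collect at ReadsSparkSinkUnitTest.java:182"
            List<Read> actual   = reloaded.collect();
            assertEquals(actual.size(), expected.size());
            assertEquals(reloaded.count(), original.count());   // analogous to the "count at ...:185" jobs
        }
    }
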
19:48:47.564 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:47.565 INFO DAGScheduler - Got job 163 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:47.565 INFO DAGScheduler - Final stage: ResultStage 218 (count at ReadsSparkSinkUnitTest.java:185)
19:48:47.565 INFO DAGScheduler - Parents of final stage: List()
19:48:47.565 INFO DAGScheduler - Missing parents: List()
19:48:47.565 INFO DAGScheduler - Submitting ResultStage 218 (MapPartitionsRDD[1021] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:47.581 INFO MemoryStore - Block broadcast_436 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
19:48:47.582 INFO MemoryStore - Block broadcast_436_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
19:48:47.582 INFO BlockManagerInfo - Added broadcast_436_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:47.583 INFO SparkContext - Created broadcast 436 from broadcast at DAGScheduler.scala:1580
19:48:47.583 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 218 (MapPartitionsRDD[1021] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:47.583 INFO TaskSchedulerImpl - Adding task set 218.0 with 1 tasks resource profile 0
19:48:47.583 INFO TaskSetManager - Starting task 0.0 in stage 218.0 (TID 274) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:47.583 INFO Executor - Running task 0.0 in stage 218.0 (TID 274)
19:48:47.612 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:47.622 INFO Executor - Finished task 0.0 in stage 218.0 (TID 274). 989 bytes result sent to driver
19:48:47.622 INFO TaskSetManager - Finished task 0.0 in stage 218.0 (TID 274) in 39 ms on localhost (executor driver) (1/1)
19:48:47.622 INFO TaskSchedulerImpl - Removed TaskSet 218.0, whose tasks have all completed, from pool
19:48:47.622 INFO DAGScheduler - ResultStage 218 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
19:48:47.622 INFO DAGScheduler - Job 163 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:47.622 INFO TaskSchedulerImpl - Killing all running tasks in stage 218: Stage finished
19:48:47.622 INFO DAGScheduler - Job 163 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057960 s
19:48:47.626 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:47.626 INFO DAGScheduler - Got job 164 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:47.626 INFO DAGScheduler - Final stage: ResultStage 219 (count at ReadsSparkSinkUnitTest.java:185)
19:48:47.626 INFO DAGScheduler - Parents of final stage: List()
19:48:47.626 INFO DAGScheduler - Missing parents: List()
19:48:47.626 INFO DAGScheduler - Submitting ResultStage 219 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:47.632 INFO MemoryStore - Block broadcast_437 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
19:48:47.632 INFO MemoryStore - Block broadcast_437_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
19:48:47.632 INFO BlockManagerInfo - Added broadcast_437_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:47.633 INFO SparkContext - Created broadcast 437 from broadcast at DAGScheduler.scala:1580
19:48:47.633 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 219 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:47.633 INFO TaskSchedulerImpl - Adding task set 219.0 with 1 tasks resource profile 0
19:48:47.633 INFO TaskSetManager - Starting task 0.0 in stage 219.0 (TID 275) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:47.633 INFO Executor - Running task 0.0 in stage 219.0 (TID 275)
19:48:47.644 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam:0+237038
19:48:47.644 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.645 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam dst=null perm=null proto=rpc
19:48:47.646 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.646 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.646 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eeee5049-8672-4ddd-95bb-a6223a705c84.bam.bai dst=null perm=null proto=rpc
19:48:47.648 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:47.649 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:47.649 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:47.650 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:47.651 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:47.652 INFO Executor - Finished task 0.0 in stage 219.0 (TID 275). 989 bytes result sent to driver
19:48:47.652 INFO TaskSetManager - Finished task 0.0 in stage 219.0 (TID 275) in 19 ms on localhost (executor driver) (1/1)
19:48:47.652 INFO TaskSchedulerImpl - Removed TaskSet 219.0, whose tasks have all completed, from pool
19:48:47.652 INFO DAGScheduler - ResultStage 219 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
19:48:47.652 INFO DAGScheduler - Job 164 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:47.652 INFO TaskSchedulerImpl - Killing all running tasks in stage 219: Stage finished
19:48:47.652 INFO DAGScheduler - Job 164 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026758 s
19:48:47.660 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:47.661 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.662 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:47.662 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:47.664 INFO MemoryStore - Block broadcast_438 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:47.670 INFO MemoryStore - Block broadcast_438_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:47.670 INFO BlockManagerInfo - Added broadcast_438_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:47.671 INFO SparkContext - Created broadcast 438 from newAPIHadoopFile at PathSplitSource.java:96
19:48:47.693 INFO MemoryStore - Block broadcast_439 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
19:48:47.699 INFO MemoryStore - Block broadcast_439_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:47.699 INFO BlockManagerInfo - Added broadcast_439_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:47.699 INFO SparkContext - Created broadcast 439 from newAPIHadoopFile at PathSplitSource.java:96
19:48:47.720 INFO FileInputFormat - Total input files to process : 1
19:48:47.721 INFO MemoryStore - Block broadcast_440 stored as values in memory (estimated size 160.7 KiB, free 1917.5 MiB)
19:48:47.722 INFO MemoryStore - Block broadcast_440_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
19:48:47.722 INFO BlockManagerInfo - Added broadcast_440_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:47.722 INFO SparkContext - Created broadcast 440 from broadcast at ReadsSparkSink.java:133
19:48:47.723 INFO MemoryStore - Block broadcast_441 stored as values in memory (estimated size 163.2 KiB, free 1917.4 MiB)
19:48:47.724 INFO MemoryStore - Block broadcast_441_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
19:48:47.724 INFO BlockManagerInfo - Added broadcast_441_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:47.724 INFO SparkContext - Created broadcast 441 from broadcast at BamSink.java:76
19:48:47.726 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts dst=null perm=null proto=rpc
19:48:47.727 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:47.727 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:47.727 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:47.728 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
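
The committer lines above show that no PathOutputCommitterFactory is configured, so the classic FileOutputCommitter with commit algorithm version 1 is used and job setup only has to mkdir <output>.parts/_temporary/0. A small sketch of the Hadoop configuration knobs behind those messages, assuming their standard property names:

    import org.apache.hadoop.conf.Configuration;

    public class CommitterConfigSketch {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // v1 commits tasks via directory renames, as the audit trail in this log shows;
            // v2 would rename files straight into the destination at task commit time.
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            // Keep _temporary cleanup enabled and do not ignore cleanup failures,
            // matching the "skip cleanup ... :false, ignore cleanup failures: false" lines.
            conf.setBoolean("mapreduce.fileoutputcommitter.cleanup.skipped", false);
            conf.setBoolean("mapreduce.fileoutputcommitter.cleanup-failures.ignored", false);
            System.out.println(conf.get("mapreduce.fileoutputcommitter.algorithm.version"));
        }
    }
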
19:48:47.733 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:47.734 INFO DAGScheduler - Registering RDD 1053 (mapToPair at SparkUtils.java:161) as input to shuffle 44
19:48:47.734 INFO DAGScheduler - Got job 165 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:47.734 INFO DAGScheduler - Final stage: ResultStage 221 (runJob at SparkHadoopWriter.scala:83)
19:48:47.734 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 220)
19:48:47.734 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 220)
19:48:47.734 INFO DAGScheduler - Submitting ShuffleMapStage 220 (MapPartitionsRDD[1053] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:47.755 INFO MemoryStore - Block broadcast_442 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
19:48:47.757 INFO MemoryStore - Block broadcast_442_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
19:48:47.757 INFO BlockManagerInfo - Added broadcast_442_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.4 MiB)
19:48:47.757 INFO SparkContext - Created broadcast 442 from broadcast at DAGScheduler.scala:1580
19:48:47.757 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 220 (MapPartitionsRDD[1053] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:47.757 INFO TaskSchedulerImpl - Adding task set 220.0 with 1 tasks resource profile 0
19:48:47.758 INFO TaskSetManager - Starting task 0.0 in stage 220.0 (TID 276) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:47.758 INFO Executor - Running task 0.0 in stage 220.0 (TID 276)
19:48:47.788 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:47.803 INFO Executor - Finished task 0.0 in stage 220.0 (TID 276). 1148 bytes result sent to driver
19:48:47.803 INFO TaskSetManager - Finished task 0.0 in stage 220.0 (TID 276) in 45 ms on localhost (executor driver) (1/1)
19:48:47.803 INFO TaskSchedulerImpl - Removed TaskSet 220.0, whose tasks have all completed, from pool
19:48:47.804 INFO DAGScheduler - ShuffleMapStage 220 (mapToPair at SparkUtils.java:161) finished in 0.069 s
19:48:47.804 INFO DAGScheduler - looking for newly runnable stages
19:48:47.804 INFO DAGScheduler - running: HashSet()
19:48:47.804 INFO DAGScheduler - waiting: HashSet(ResultStage 221)
19:48:47.804 INFO DAGScheduler - failed: HashSet()
19:48:47.804 INFO DAGScheduler - Submitting ResultStage 221 (MapPartitionsRDD[1058] at mapToPair at BamSink.java:91), which has no missing parents
19:48:47.814 INFO MemoryStore - Block broadcast_443 stored as values in memory (estimated size 241.5 KiB, free 1916.4 MiB)
19:48:47.815 INFO MemoryStore - Block broadcast_443_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.4 MiB)
19:48:47.815 INFO BlockManagerInfo - Added broadcast_443_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.3 MiB)
19:48:47.815 INFO SparkContext - Created broadcast 443 from broadcast at DAGScheduler.scala:1580
19:48:47.815 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 221 (MapPartitionsRDD[1058] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:47.815 INFO TaskSchedulerImpl - Adding task set 221.0 with 1 tasks resource profile 0
19:48:47.816 INFO TaskSetManager - Starting task 0.0 in stage 221.0 (TID 277) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:47.816 INFO Executor - Running task 0.0 in stage 221.0 (TID 277)
19:48:47.820 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:47.820 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
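
Stages 220/221 show the write path's shuffle boundary: the mapToPair at SparkUtils.java:161 keys the reads, the DAGScheduler registers that RDD as input to shuffle 44, and the downstream sink stage then fetches a single node-local shuffle block (no remote fetches, since everything runs in one local executor). A rough Java illustration of that pair-then-shuffle shape; the String key and the "Read" type are placeholders, not what SparkUtils actually uses.

    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import scala.Tuple2;

    class ShuffleSketch {
        // Keying and sorting forces a ShuffleMapStage ahead of the stage that consumes the pairs,
        // mirroring ShuffleMapStage 220 -> ResultStage 221 above.
        static <Read> JavaPairRDD<String, Read> keyAndSort(JavaRDD<Read> reads) {
            return reads
                .mapToPair(r -> new Tuple2<>(r.toString(), r))   // placeholder key extraction
                .sortByKey();
        }
    }
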
19:48:47.831 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:47.831 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:47.831 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:47.831 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:47.831 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:47.831 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:47.832 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.833 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.834 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:47.836 INFO StateChange - BLOCK* allocate blk_1073741878_1054, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/part-r-00000
19:48:47.837 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741878_1054 src: /127.0.0.1:53142 dest: /127.0.0.1:45925
19:48:47.839 INFO clienttrace - src: /127.0.0.1:53142, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741878_1054, duration(ns): 1062631
19:48:47.839 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741878_1054, type=LAST_IN_PIPELINE terminating
19:48:47.840 INFO FSNamesystem - BLOCK* blk_1073741878_1054 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/part-r-00000
19:48:48.240 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:48.241 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:48.241 INFO StateChange - BLOCK* allocate blk_1073741879_1055, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/.part-r-00000.sbi
19:48:48.242 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741879_1055 src: /127.0.0.1:53156 dest: /127.0.0.1:45925
19:48:48.243 INFO clienttrace - src: /127.0.0.1:53156, dest: /127.0.0.1:45925, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741879_1055, duration(ns): 448051
19:48:48.243 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741879_1055, type=LAST_IN_PIPELINE terminating
19:48:48.244 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:48.246 INFO StateChange - BLOCK* allocate blk_1073741880_1056, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/.part-r-00000.bai
19:48:48.246 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741880_1056 src: /127.0.0.1:53164 dest: /127.0.0.1:45925
19:48:48.247 INFO clienttrace - src: /127.0.0.1:53164, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741880_1056, duration(ns): 397837
19:48:48.247 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741880_1056, type=LAST_IN_PIPELINE terminating
19:48:48.248 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:48.249 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0 dst=null perm=null proto=rpc
19:48:48.249 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0 dst=null perm=null proto=rpc
19:48:48.250 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/task_202507151948475099496830906326704_1058_r_000000 dst=null perm=null proto=rpc
19:48:48.250 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/_temporary/attempt_202507151948475099496830906326704_1058_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/task_202507151948475099496830906326704_1058_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:48.251 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948475099496830906326704_1058_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/task_202507151948475099496830906326704_1058_r_000000
19:48:48.251 INFO SparkHadoopMapRedUtil - attempt_202507151948475099496830906326704_1058_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:48.251 INFO Executor - Finished task 0.0 in stage 221.0 (TID 277). 1858 bytes result sent to driver
19:48:48.251 INFO TaskSetManager - Finished task 0.0 in stage 221.0 (TID 277) in 435 ms on localhost (executor driver) (1/1)
19:48:48.251 INFO TaskSchedulerImpl - Removed TaskSet 221.0, whose tasks have all completed, from pool
19:48:48.252 INFO DAGScheduler - ResultStage 221 (runJob at SparkHadoopWriter.scala:83) finished in 0.447 s
19:48:48.252 INFO DAGScheduler - Job 165 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:48.252 INFO TaskSchedulerImpl - Killing all running tasks in stage 221: Stage finished
19:48:48.252 INFO DAGScheduler - Job 165 finished: runJob at SparkHadoopWriter.scala:83, took 0.518361 s
19:48:48.252 INFO SparkHadoopWriter - Start to commit write Job job_202507151948475099496830906326704_1058.
19:48:48.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:48.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts dst=null perm=null proto=rpc
19:48:48.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/task_202507151948475099496830906326704_1058_r_000000 dst=null perm=null proto=rpc
19:48:48.254 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:48.254 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/task_202507151948475099496830906326704_1058_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.255 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:48.255 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/task_202507151948475099496830906326704_1058_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.256 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:48.256 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary/0/task_202507151948475099496830906326704_1058_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.257 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:48.258 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.258 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:48.259 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.spark-staging-1058 dst=null perm=null proto=rpc
19:48:48.259 INFO SparkHadoopWriter - Write Job job_202507151948475099496830906326704_1058 committed. Elapsed time: 6 ms.
19:48:48.260 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.261 INFO StateChange - BLOCK* allocate blk_1073741881_1057, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/header
19:48:48.262 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741881_1057 src: /127.0.0.1:53172 dest: /127.0.0.1:45925
19:48:48.263 INFO clienttrace - src: /127.0.0.1:53172, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741881_1057, duration(ns): 442838
19:48:48.263 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741881_1057, type=LAST_IN_PIPELINE terminating
19:48:48.263 INFO FSNamesystem - BLOCK* blk_1073741881_1057 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/header
19:48:48.664 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:48.665 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.666 INFO StateChange - BLOCK* allocate blk_1073741882_1058, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/terminator
19:48:48.667 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741882_1058 src: /127.0.0.1:53180 dest: /127.0.0.1:45925
19:48:48.668 INFO clienttrace - src: /127.0.0.1:53180, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741882_1058, duration(ns): 433109
19:48:48.668 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741882_1058, type=LAST_IN_PIPELINE terminating
19:48:48.668 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:48.669 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts dst=null perm=null proto=rpc
19:48:48.670 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.670 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:48.671 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam
19:48:48.671 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.671 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:48.672 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:48.672 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.673 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam done
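
The concat-and-rename just logged is how the .parts directory becomes a single BAM: an empty output file is created, HDFS concat stitches header, part-r-00000 and terminator onto it, any stale destination is deleted, and output is renamed to the final .bam. A sketch of that sequence with plain FileSystem calls (the real implementation lives in disq's HadoopFileSystemWrapper; the names below are placeholders):

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Mirrors the cmd=create / cmd=concat / cmd=delete / cmd=rename sequence in the audit log.
    static void concatParts(FileSystem fs, Path partsDir, Path finalBam) throws Exception {
        Path output = new Path(partsDir, "output");
        fs.create(output).close();                       // empty concat target (cmd=create)
        fs.concat(output, new Path[] {                   // cmd=concat header + part + terminator
            new Path(partsDir, "header"),
            new Path(partsDir, "part-r-00000"),
            new Path(partsDir, "terminator")});
        fs.delete(finalBam, true);                       // cmd=delete any stale destination
        fs.rename(output, finalBam);                     // cmd=rename .parts/output -> .bam
    }
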
19:48:48.673 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:48.673 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi
19:48:48.673 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts dst=null perm=null proto=rpc
19:48:48.674 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:48.675 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:48.675 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:48.676 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
19:48:48.677 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:48.677 INFO StateChange - BLOCK* allocate blk_1073741883_1059, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi
19:48:48.678 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741883_1059 src: /127.0.0.1:53194 dest: /127.0.0.1:45925
19:48:48.679 INFO clienttrace - src: /127.0.0.1:53194, dest: /127.0.0.1:45925, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741883_1059, duration(ns): 393357
19:48:48.679 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741883_1059, type=LAST_IN_PIPELINE terminating
19:48:48.680 INFO FSNamesystem - BLOCK* blk_1073741883_1059 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi
19:48:49.080 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:49.081 INFO IndexFileMerger - Done merging .sbi files
19:48:49.081 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai
19:48:49.081 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts dst=null perm=null proto=rpc
19:48:49.082 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:49.083 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:49.083 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:49.084 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:49.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:49.085 INFO StateChange - BLOCK* allocate blk_1073741884_1060, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai
19:48:49.086 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741884_1060 src: /127.0.0.1:53204 dest: /127.0.0.1:45925
19:48:49.087 INFO clienttrace - src: /127.0.0.1:53204, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741884_1060, duration(ns): 404156
19:48:49.087 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741884_1060, type=LAST_IN_PIPELINE terminating
19:48:49.088 INFO FSNamesystem - BLOCK* blk_1073741884_1060 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai
19:48:49.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741872_1048 replica FinalizedReplica, blk_1073741872_1048, FINALIZED, getNumBytes()=212, getBytesOnDisk()=212, getVisibleLength()=212, getVolume()=/tmp/minicluster_storage10689261495343833868/data/data2, getBlockURI()=file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741872 for deletion
19:48:49.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741873_1049 replica FinalizedReplica, blk_1073741873_1049, FINALIZED, getNumBytes()=5472, getBytesOnDisk()=5472, getVisibleLength()=5472, getVolume()=/tmp/minicluster_storage10689261495343833868/data/data1, getBlockURI()=file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741873 for deletion
19:48:49.415 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741879_1055 replica FinalizedReplica, blk_1073741879_1055, FINALIZED, getNumBytes()=13492, getBytesOnDisk()=13492, getVisibleLength()=13492, getVolume()=/tmp/minicluster_storage10689261495343833868/data/data1, getBlockURI()=file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741879 for deletion
19:48:49.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741872_1048 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741872
19:48:49.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741873_1049 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741873
19:48:49.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741879_1055 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741879
19:48:49.488 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:49.489 INFO IndexFileMerger - Done merging .bai files
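
The .sbi and .bai merges read each per-part index out of the .parts directory, write the merged index next to the final BAM, and delete the part. With a single part, as in this run, the filesystem traffic reduces to the list/open/create/delete sequence in the audit log; the sketch below mirrors only that traffic and skips the virtual-offset adjustment that the real IndexFileMerger performs when stitching indexes from multiple parts:

    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    // Illustration of the FileSystem call sequence behind the .bai merge above;
    // not a faithful index merger (offsets are not adjusted here).
    static void mergeSinglePartIndex(FileSystem fs, Path partsDir, Path mergedBai) throws Exception {
        try (FSDataOutputStream out = fs.create(mergedBai, true)) {      // cmd=create .bam.bai
            for (FileStatus st : fs.listStatus(partsDir)) {              // cmd=listStatus .parts
                if (st.getPath().getName().endsWith(".bai")) {
                    try (FSDataInputStream in = fs.open(st.getPath())) { // cmd=open .part-r-00000.bai
                        IOUtils.copyBytes(in, out, 64 * 1024, false);
                    }
                    fs.delete(st.getPath(), false);                      // cmd=delete the part index
                }
            }
        }
    }
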
19:48:49.489 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.parts dst=null perm=null proto=rpc
19:48:49.498 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi dst=null perm=null proto=rpc
19:48:49.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi dst=null perm=null proto=rpc
19:48:49.506 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi dst=null perm=null proto=rpc
19:48:49.506 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
19:48:49.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.508 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.508 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.509 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.509 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.509 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.511 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:49.512 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:49.512 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:49.513 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:49.513 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi dst=null perm=null proto=rpc
19:48:49.513 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi dst=null perm=null proto=rpc
19:48:49.514 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.sbi dst=null perm=null proto=rpc
19:48:49.515 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
19:48:49.515 INFO MemoryStore - Block broadcast_444 stored as values in memory (estimated size 13.3 KiB, free 1916.4 MiB)
19:48:49.519 INFO MemoryStore - Block broadcast_444_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1916.3 MiB)
19:48:49.519 INFO BlockManagerInfo - Added broadcast_444_piece0 in memory on localhost:36125 (size: 8.3 KiB, free: 1919.3 MiB)
19:48:49.520 INFO SparkContext - Created broadcast 444 from broadcast at BamSource.java:104
19:48:49.520 INFO BlockManagerInfo - Removed broadcast_437_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.3 MiB)
19:48:49.520 INFO BlockManagerInfo - Removed broadcast_439_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:49.521 INFO BlockManagerInfo - Removed broadcast_435_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.4 MiB)
19:48:49.521 INFO MemoryStore - Block broadcast_445 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
19:48:49.521 INFO BlockManagerInfo - Removed broadcast_436_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.6 MiB)
19:48:49.521 INFO BlockManagerInfo - Removed broadcast_427_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:49.522 INFO BlockManagerInfo - Removed broadcast_440_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.7 MiB)
19:48:49.522 INFO BlockManagerInfo - Removed broadcast_442_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.8 MiB)
19:48:49.523 INFO BlockManagerInfo - Removed broadcast_434_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:49.523 INFO BlockManagerInfo - Removed broadcast_441_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:49.524 INFO BlockManagerInfo - Removed broadcast_433_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.9 MiB)
19:48:49.524 INFO BlockManagerInfo - Removed broadcast_443_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.9 MiB)
19:48:49.530 INFO MemoryStore - Block broadcast_445_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:49.530 INFO BlockManagerInfo - Added broadcast_445_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:49.530 INFO SparkContext - Created broadcast 445 from newAPIHadoopFile at PathSplitSource.java:96
19:48:49.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.539 INFO FileInputFormat - Total input files to process : 1
19:48:49.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.554 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:49.555 INFO DAGScheduler - Got job 166 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:49.555 INFO DAGScheduler - Final stage: ResultStage 222 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:49.555 INFO DAGScheduler - Parents of final stage: List()
19:48:49.555 INFO DAGScheduler - Missing parents: List()
19:48:49.555 INFO DAGScheduler - Submitting ResultStage 222 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:49.561 INFO MemoryStore - Block broadcast_446 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
19:48:49.561 INFO MemoryStore - Block broadcast_446_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
19:48:49.562 INFO BlockManagerInfo - Added broadcast_446_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:49.562 INFO SparkContext - Created broadcast 446 from broadcast at DAGScheduler.scala:1580
19:48:49.562 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 222 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:49.562 INFO TaskSchedulerImpl - Adding task set 222.0 with 1 tasks resource profile 0
19:48:49.562 INFO TaskSetManager - Starting task 0.0 in stage 222.0 (TID 278) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:49.563 INFO Executor - Running task 0.0 in stage 222.0 (TID 278)
19:48:49.574 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam:0+237038
19:48:49.574 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.575 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.575 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.576 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.576 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.578 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:49.579 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:49.580 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:49.581 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:49.581 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:49.583 INFO Executor - Finished task 0.0 in stage 222.0 (TID 278). 651483 bytes result sent to driver
19:48:49.585 INFO TaskSetManager - Finished task 0.0 in stage 222.0 (TID 278) in 23 ms on localhost (executor driver) (1/1)
19:48:49.585 INFO TaskSchedulerImpl - Removed TaskSet 222.0, whose tasks have all completed, from pool
19:48:49.585 INFO DAGScheduler - ResultStage 222 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.030 s
19:48:49.585 INFO DAGScheduler - Job 166 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:49.585 INFO TaskSchedulerImpl - Killing all running tasks in stage 222: Stage finished
19:48:49.585 INFO DAGScheduler - Job 166 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.030628 s
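
Job 166 is the read-back check: the freshly concatenated BAM on the mini-cluster is opened as a NewHadoopRDD split and collected on the driver. A hedged sketch of an equivalent read, assuming disq's documented HtsjdkReadsRddStorage entry point (the test itself goes through GATK's ReadsSparkSource wrapper rather than calling disq directly):

    import java.util.List;
    import htsjdk.samtools.SAMRecord;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.disq_bio.disq.HtsjdkReadsRdd;
    import org.disq_bio.disq.HtsjdkReadsRddStorage;

    // Hedged sketch: read the round-tripped BAM back into an RDD and collect it,
    // assuming disq's published API; not the test's own code path.
    static List<SAMRecord> readBack(JavaSparkContext jsc, String bamPath) throws Exception {
        HtsjdkReadsRdd readsRdd = HtsjdkReadsRddStorage.makeDefault(jsc).read(bamPath);
        JavaRDD<SAMRecord> reads = readsRdd.getReads();
        return reads.collect();   // analogous to "collect at ReadsSparkSinkUnitTest.java:182"
    }
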
19:48:49.594 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:49.594 INFO DAGScheduler - Got job 167 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:49.594 INFO DAGScheduler - Final stage: ResultStage 223 (count at ReadsSparkSinkUnitTest.java:185)
19:48:49.594 INFO DAGScheduler - Parents of final stage: List()
19:48:49.594 INFO DAGScheduler - Missing parents: List()
19:48:49.594 INFO DAGScheduler - Submitting ResultStage 223 (MapPartitionsRDD[1046] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:49.611 INFO MemoryStore - Block broadcast_447 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
19:48:49.612 INFO MemoryStore - Block broadcast_447_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.5 MiB)
19:48:49.612 INFO BlockManagerInfo - Added broadcast_447_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:49.612 INFO SparkContext - Created broadcast 447 from broadcast at DAGScheduler.scala:1580
19:48:49.613 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 223 (MapPartitionsRDD[1046] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:49.613 INFO TaskSchedulerImpl - Adding task set 223.0 with 1 tasks resource profile 0
19:48:49.613 INFO TaskSetManager - Starting task 0.0 in stage 223.0 (TID 279) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:49.613 INFO Executor - Running task 0.0 in stage 223.0 (TID 279)
19:48:49.642 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:49.651 INFO Executor - Finished task 0.0 in stage 223.0 (TID 279). 989 bytes result sent to driver
19:48:49.652 INFO TaskSetManager - Finished task 0.0 in stage 223.0 (TID 279) in 39 ms on localhost (executor driver) (1/1)
19:48:49.652 INFO TaskSchedulerImpl - Removed TaskSet 223.0, whose tasks have all completed, from pool
19:48:49.652 INFO DAGScheduler - ResultStage 223 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
19:48:49.652 INFO DAGScheduler - Job 167 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:49.652 INFO TaskSchedulerImpl - Killing all running tasks in stage 223: Stage finished
19:48:49.652 INFO DAGScheduler - Job 167 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058006 s
19:48:49.655 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:49.656 INFO DAGScheduler - Got job 168 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:49.656 INFO DAGScheduler - Final stage: ResultStage 224 (count at ReadsSparkSinkUnitTest.java:185)
19:48:49.656 INFO DAGScheduler - Parents of final stage: List()
19:48:49.656 INFO DAGScheduler - Missing parents: List()
19:48:49.656 INFO DAGScheduler - Submitting ResultStage 224 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:49.662 INFO MemoryStore - Block broadcast_448 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
19:48:49.663 INFO MemoryStore - Block broadcast_448_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.3 MiB)
19:48:49.663 INFO BlockManagerInfo - Added broadcast_448_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:49.663 INFO SparkContext - Created broadcast 448 from broadcast at DAGScheduler.scala:1580
19:48:49.663 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 224 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:49.663 INFO TaskSchedulerImpl - Adding task set 224.0 with 1 tasks resource profile 0
19:48:49.663 INFO TaskSetManager - Starting task 0.0 in stage 224.0 (TID 280) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:49.664 INFO Executor - Running task 0.0 in stage 224.0 (TID 280)
19:48:49.674 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam:0+237038
19:48:49.675 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.676 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam dst=null perm=null proto=rpc
19:48:49.676 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.677 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.677 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_63ed2927-e177-4259-955b-32f7f3438c6c.bam.bai dst=null perm=null proto=rpc
19:48:49.680 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:49.680 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:49.681 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:49.682 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:49.683 INFO Executor - Finished task 0.0 in stage 224.0 (TID 280). 989 bytes result sent to driver
19:48:49.684 INFO TaskSetManager - Finished task 0.0 in stage 224.0 (TID 280) in 21 ms on localhost (executor driver) (1/1)
19:48:49.684 INFO TaskSchedulerImpl - Removed TaskSet 224.0, whose tasks have all completed, from pool
19:48:49.684 INFO DAGScheduler - ResultStage 224 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.028 s
19:48:49.684 INFO DAGScheduler - Job 168 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:49.684 INFO TaskSchedulerImpl - Killing all running tasks in stage 224: Stage finished
19:48:49.684 INFO DAGScheduler - Job 168 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.028421 s
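
Jobs 167 and 168 count the reads in the original local input (HiSeq.1mb.1RG.2k_lines.bam) and in the round-tripped HDFS copy; the test presumably asserts that the two counts match. A trivial sketch of that comparison (variable names are placeholders; GATK's test suite uses TestNG):

    import org.apache.spark.api.java.JavaRDD;
    import org.testng.Assert;

    // Placeholder names; mirrors the count comparison driven by jobs 167 and 168.
    static void assertSameReadCount(JavaRDD<?> originalReads, JavaRDD<?> roundTripReads) {
        long originalCount = originalReads.count();    // local HiSeq.1mb.1RG.2k_lines.bam
        long writtenCount  = roundTripReads.count();   // BAM written back to the mini-cluster
        Assert.assertEquals(writtenCount, originalCount);
    }
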
19:48:49.692 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:49.693 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:49.694 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:49.694 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:49.696 INFO MemoryStore - Block broadcast_449 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
19:48:49.702 INFO MemoryStore - Block broadcast_449_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:49.702 INFO BlockManagerInfo - Added broadcast_449_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:49.703 INFO SparkContext - Created broadcast 449 from newAPIHadoopFile at PathSplitSource.java:96
19:48:49.724 INFO MemoryStore - Block broadcast_450 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
19:48:49.730 INFO MemoryStore - Block broadcast_450_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:49.730 INFO BlockManagerInfo - Added broadcast_450_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:49.730 INFO SparkContext - Created broadcast 450 from newAPIHadoopFile at PathSplitSource.java:96
19:48:49.749 INFO FileInputFormat - Total input files to process : 1
19:48:49.751 INFO MemoryStore - Block broadcast_451 stored as values in memory (estimated size 160.7 KiB, free 1917.5 MiB)
19:48:49.752 INFO MemoryStore - Block broadcast_451_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
19:48:49.752 INFO BlockManagerInfo - Added broadcast_451_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:49.752 INFO SparkContext - Created broadcast 451 from broadcast at ReadsSparkSink.java:133
19:48:49.752 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:49.753 INFO MemoryStore - Block broadcast_452 stored as values in memory (estimated size 163.2 KiB, free 1917.3 MiB)
19:48:49.754 INFO MemoryStore - Block broadcast_452_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
19:48:49.754 INFO BlockManagerInfo - Added broadcast_452_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:49.754 INFO SparkContext - Created broadcast 452 from broadcast at BamSink.java:76
19:48:49.756 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts dst=null perm=null proto=rpc
19:48:49.756 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:49.756 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:49.756 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:49.757 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:49.763 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:49.763 INFO DAGScheduler - Registering RDD 1078 (mapToPair at SparkUtils.java:161) as input to shuffle 45
19:48:49.763 INFO DAGScheduler - Got job 169 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:49.764 INFO DAGScheduler - Final stage: ResultStage 226 (runJob at SparkHadoopWriter.scala:83)
19:48:49.764 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 225)
19:48:49.764 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 225)
19:48:49.764 INFO DAGScheduler - Submitting ShuffleMapStage 225 (MapPartitionsRDD[1078] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:49.780 INFO MemoryStore - Block broadcast_453 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
19:48:49.782 INFO MemoryStore - Block broadcast_453_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
19:48:49.782 INFO BlockManagerInfo - Added broadcast_453_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.4 MiB)
19:48:49.782 INFO SparkContext - Created broadcast 453 from broadcast at DAGScheduler.scala:1580
19:48:49.782 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 225 (MapPartitionsRDD[1078] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:49.782 INFO TaskSchedulerImpl - Adding task set 225.0 with 1 tasks resource profile 0
19:48:49.783 INFO TaskSetManager - Starting task 0.0 in stage 225.0 (TID 281) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:49.783 INFO Executor - Running task 0.0 in stage 225.0 (TID 281)
19:48:49.813 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:49.827 INFO Executor - Finished task 0.0 in stage 225.0 (TID 281). 1148 bytes result sent to driver
19:48:49.827 INFO TaskSetManager - Finished task 0.0 in stage 225.0 (TID 281) in 44 ms on localhost (executor driver) (1/1)
19:48:49.828 INFO TaskSchedulerImpl - Removed TaskSet 225.0, whose tasks have all completed, from pool
19:48:49.828 INFO DAGScheduler - ShuffleMapStage 225 (mapToPair at SparkUtils.java:161) finished in 0.064 s
19:48:49.828 INFO DAGScheduler - looking for newly runnable stages
19:48:49.828 INFO DAGScheduler - running: HashSet()
19:48:49.828 INFO DAGScheduler - waiting: HashSet(ResultStage 226)
19:48:49.828 INFO DAGScheduler - failed: HashSet()
19:48:49.828 INFO DAGScheduler - Submitting ResultStage 226 (MapPartitionsRDD[1083] at mapToPair at BamSink.java:91), which has no missing parents
19:48:49.837 INFO MemoryStore - Block broadcast_454 stored as values in memory (estimated size 241.5 KiB, free 1916.4 MiB)
19:48:49.838 INFO MemoryStore - Block broadcast_454_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.4 MiB)
19:48:49.838 INFO BlockManagerInfo - Added broadcast_454_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.3 MiB)
19:48:49.838 INFO SparkContext - Created broadcast 454 from broadcast at DAGScheduler.scala:1580
19:48:49.838 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 226 (MapPartitionsRDD[1083] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:49.838 INFO TaskSchedulerImpl - Adding task set 226.0 with 1 tasks resource profile 0
19:48:49.839 INFO TaskSetManager - Starting task 0.0 in stage 226.0 (TID 282) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:49.839 INFO Executor - Running task 0.0 in stage 226.0 (TID 282)
19:48:49.843 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:49.843 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:49.856 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:49.856 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:49.857 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:49.857 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:49.857 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:49.857 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:49.858 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:49.859 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:49.861 INFO StateChange - BLOCK* allocate blk_1073741885_1061, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0/part-r-00000
19:48:49.862 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741885_1061 src: /127.0.0.1:53232 dest: /127.0.0.1:45925
19:48:49.863 INFO clienttrace - src: /127.0.0.1:53232, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741885_1061, duration(ns): 1065172
19:48:49.864 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741885_1061, type=LAST_IN_PIPELINE terminating
19:48:49.864 INFO FSNamesystem - BLOCK* blk_1073741885_1061 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0/part-r-00000
19:48:50.265 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:50.265 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:50.267 INFO StateChange - BLOCK* allocate blk_1073741886_1062, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0/.part-r-00000.bai
19:48:50.267 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741886_1062 src: /127.0.0.1:53236 dest: /127.0.0.1:45925
19:48:50.268 INFO clienttrace - src: /127.0.0.1:53236, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741886_1062, duration(ns): 373260
19:48:50.268 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741886_1062, type=LAST_IN_PIPELINE terminating
19:48:50.269 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:50.270 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0 dst=null perm=null proto=rpc
19:48:50.270 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0 dst=null perm=null proto=rpc
19:48:50.270 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/task_202507151948494417002334924721652_1083_r_000000 dst=null perm=null proto=rpc
19:48:50.271 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/_temporary/attempt_202507151948494417002334924721652_1083_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/task_202507151948494417002334924721652_1083_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:50.271 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948494417002334924721652_1083_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/task_202507151948494417002334924721652_1083_r_000000
19:48:50.271 INFO SparkHadoopMapRedUtil - attempt_202507151948494417002334924721652_1083_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:50.272 INFO Executor - Finished task 0.0 in stage 226.0 (TID 282). 1858 bytes result sent to driver
19:48:50.272 INFO TaskSetManager - Finished task 0.0 in stage 226.0 (TID 282) in 433 ms on localhost (executor driver) (1/1)
19:48:50.272 INFO TaskSchedulerImpl - Removed TaskSet 226.0, whose tasks have all completed, from pool
19:48:50.272 INFO DAGScheduler - ResultStage 226 (runJob at SparkHadoopWriter.scala:83) finished in 0.444 s
19:48:50.272 INFO DAGScheduler - Job 169 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:50.272 INFO TaskSchedulerImpl - Killing all running tasks in stage 226: Stage finished
19:48:50.273 INFO DAGScheduler - Job 169 finished: runJob at SparkHadoopWriter.scala:83, took 0.509525 s
19:48:50.273 INFO SparkHadoopWriter - Start to commit write Job job_202507151948494417002334924721652_1083.
19:48:50.273 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:50.274 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts dst=null perm=null proto=rpc
19:48:50.274 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/task_202507151948494417002334924721652_1083_r_000000 dst=null perm=null proto=rpc
19:48:50.274 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:50.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/task_202507151948494417002334924721652_1083_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:50.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:50.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary/0/task_202507151948494417002334924721652_1083_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:50.276 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:50.277 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:50.277 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:50.278 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/.spark-staging-1083 dst=null perm=null proto=rpc
19:48:50.278 INFO SparkHadoopWriter - Write Job job_202507151948494417002334924721652_1083 committed. Elapsed time: 5 ms.
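
This is FileOutputCommitter algorithm version 1 at work: the task attempt directory was renamed to a task directory at task commit (19:48:50.271), and at job commit the task's files are renamed into the .parts directory, _temporary is deleted, and an empty _SUCCESS marker is created. A simplified, single-task sketch of the same rename choreography (the task directory name is passed in as a placeholder):

    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Simplified single-task version of the v1 job commit visible in the audit log.
    static void commitJobV1(FileSystem fs, Path outputDir, String taskDirName) throws Exception {
        Path taskDir = new Path(outputDir, "_temporary/0/" + taskDirName);         // committed task output
        for (FileStatus st : fs.listStatus(taskDir)) {
            fs.rename(st.getPath(), new Path(outputDir, st.getPath().getName()));  // cmd=rename per file
        }
        fs.delete(new Path(outputDir, "_temporary"), true);    // cmd=delete _temporary
        fs.create(new Path(outputDir, "_SUCCESS")).close();    // cmd=create _SUCCESS
    }
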
19:48:50.278 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:50.280 INFO StateChange - BLOCK* allocate blk_1073741887_1063, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/header
19:48:50.280 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741887_1063 src: /127.0.0.1:53240 dest: /127.0.0.1:45925
19:48:50.281 INFO clienttrace - src: /127.0.0.1:53240, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741887_1063, duration(ns): 391444
19:48:50.281 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741887_1063, type=LAST_IN_PIPELINE terminating
19:48:50.282 INFO FSNamesystem - BLOCK* blk_1073741887_1063 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/header
19:48:50.683 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:50.684 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:50.684 INFO StateChange - BLOCK* allocate blk_1073741888_1064, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/terminator
19:48:50.685 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741888_1064 src: /127.0.0.1:53248 dest: /127.0.0.1:45925
19:48:50.686 INFO clienttrace - src: /127.0.0.1:53248, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741888_1064, duration(ns): 373059
19:48:50.686 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741888_1064, type=LAST_IN_PIPELINE terminating
19:48:50.687 INFO FSNamesystem - BLOCK* blk_1073741888_1064 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/terminator
19:48:51.087 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:51.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts dst=null perm=null proto=rpc
19:48:51.089 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:51.089 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:51.089 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam
19:48:51.090 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:51.090 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.091 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.091 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:51.091 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam done
19:48:51.092 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.092 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai
19:48:51.092 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts dst=null perm=null proto=rpc
19:48:51.093 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:51.093 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:51.094 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:51.095 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.095 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:51.096 INFO StateChange - BLOCK* allocate blk_1073741889_1065, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai
19:48:51.097 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741889_1065 src: /127.0.0.1:53252 dest: /127.0.0.1:45925
19:48:51.098 INFO clienttrace - src: /127.0.0.1:53252, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741889_1065, duration(ns): 446356
19:48:51.098 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741889_1065, type=LAST_IN_PIPELINE terminating
19:48:51.098 INFO FSNamesystem - BLOCK* blk_1073741889_1065 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai
19:48:51.499 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:51.499 INFO IndexFileMerger - Done merging .bai files
19:48:51.500 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.parts dst=null perm=null proto=rpc
19:48:51.509 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.509 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.509 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.510 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.510 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.511 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.511 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.512 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.513 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.514 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.515 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.515 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.sbi dst=null perm=null proto=rpc
19:48:51.516 INFO MemoryStore - Block broadcast_455 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
19:48:51.522 INFO BlockManagerInfo - Removed broadcast_454_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.4 MiB)
19:48:51.522 INFO BlockManagerInfo - Removed broadcast_448_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.4 MiB)
19:48:51.522 INFO BlockManagerInfo - Removed broadcast_451_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.4 MiB)
19:48:51.523 INFO BlockManagerInfo - Removed broadcast_445_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:51.524 INFO BlockManagerInfo - Removed broadcast_438_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:51.524 INFO BlockManagerInfo - Removed broadcast_452_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.5 MiB)
19:48:51.524 INFO BlockManagerInfo - Removed broadcast_453_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.7 MiB)
19:48:51.525 INFO BlockManagerInfo - Removed broadcast_444_piece0 on localhost:36125 in memory (size: 8.3 KiB, free: 1919.7 MiB)
19:48:51.525 INFO BlockManagerInfo - Removed broadcast_450_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:51.525 INFO BlockManagerInfo - Removed broadcast_447_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.9 MiB)
19:48:51.526 INFO BlockManagerInfo - Removed broadcast_446_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1920.0 MiB)
19:48:51.529 INFO MemoryStore - Block broadcast_455_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:51.529 INFO BlockManagerInfo - Added broadcast_455_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:51.529 INFO SparkContext - Created broadcast 455 from newAPIHadoopFile at PathSplitSource.java:96
19:48:51.556 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.556 INFO FileInputFormat - Total input files to process : 1
19:48:51.557 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.593 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:51.593 INFO DAGScheduler - Got job 170 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:51.593 INFO DAGScheduler - Final stage: ResultStage 227 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:51.593 INFO DAGScheduler - Parents of final stage: List()
19:48:51.593 INFO DAGScheduler - Missing parents: List()
19:48:51.593 INFO DAGScheduler - Submitting ResultStage 227 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:51.609 INFO MemoryStore - Block broadcast_456 stored as values in memory (estimated size 426.2 KiB, free 1918.9 MiB)
19:48:51.611 INFO MemoryStore - Block broadcast_456_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.8 MiB)
19:48:51.611 INFO BlockManagerInfo - Added broadcast_456_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.8 MiB)
19:48:51.611 INFO SparkContext - Created broadcast 456 from broadcast at DAGScheduler.scala:1580
19:48:51.611 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 227 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:51.611 INFO TaskSchedulerImpl - Adding task set 227.0 with 1 tasks resource profile 0
19:48:51.612 INFO TaskSetManager - Starting task 0.0 in stage 227.0 (TID 283) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:51.612 INFO Executor - Running task 0.0 in stage 227.0 (TID 283)
19:48:51.641 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam:0+237038
19:48:51.642 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.642 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.644 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.644 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.644 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.645 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.645 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.646 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.647 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.649 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.649 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.650 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.650 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.651 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.656 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.657 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.658 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.658 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.659 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.660 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.661 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.661 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.662 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.662 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.663 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.664 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.664 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.665 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.666 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.666 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.667 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.668 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.668 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.669 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.669 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.670 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.671 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.671 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.672 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.673 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.674 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.675 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.675 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.676 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.677 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.677 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.678 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.679 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.679 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.680 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.681 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.682 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.683 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.683 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.684 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.686 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.687 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.688 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.688 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.689 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.690 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.691 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.691 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.692 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.693 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.694 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.695 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.695 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.697 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.698 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.699 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.700 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.700 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.701 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.701 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.702 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.703 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.703 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.705 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.706 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.707 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.708 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.709 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:51.712 INFO Executor - Finished task 0.0 in stage 227.0 (TID 283). 651526 bytes result sent to driver
19:48:51.713 INFO TaskSetManager - Finished task 0.0 in stage 227.0 (TID 283) in 101 ms on localhost (executor driver) (1/1)
19:48:51.713 INFO TaskSchedulerImpl - Removed TaskSet 227.0, whose tasks have all completed, from pool
19:48:51.713 INFO DAGScheduler - ResultStage 227 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.120 s
19:48:51.714 INFO DAGScheduler - Job 170 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:51.714 INFO TaskSchedulerImpl - Killing all running tasks in stage 227: Stage finished
19:48:51.714 INFO DAGScheduler - Job 170 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.121159 s
19:48:51.723 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:51.723 INFO DAGScheduler - Got job 171 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:51.723 INFO DAGScheduler - Final stage: ResultStage 228 (count at ReadsSparkSinkUnitTest.java:185)
19:48:51.723 INFO DAGScheduler - Parents of final stage: List()
19:48:51.723 INFO DAGScheduler - Missing parents: List()
19:48:51.723 INFO DAGScheduler - Submitting ResultStage 228 (MapPartitionsRDD[1071] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:51.740 INFO MemoryStore - Block broadcast_457 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
19:48:51.741 INFO MemoryStore - Block broadcast_457_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
19:48:51.742 INFO BlockManagerInfo - Added broadcast_457_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.6 MiB)
19:48:51.742 INFO SparkContext - Created broadcast 457 from broadcast at DAGScheduler.scala:1580
19:48:51.742 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 228 (MapPartitionsRDD[1071] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:51.742 INFO TaskSchedulerImpl - Adding task set 228.0 with 1 tasks resource profile 0
19:48:51.742 INFO TaskSetManager - Starting task 0.0 in stage 228.0 (TID 284) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:51.743 INFO Executor - Running task 0.0 in stage 228.0 (TID 284)
19:48:51.772 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:51.781 INFO Executor - Finished task 0.0 in stage 228.0 (TID 284). 989 bytes result sent to driver
19:48:51.782 INFO TaskSetManager - Finished task 0.0 in stage 228.0 (TID 284) in 40 ms on localhost (executor driver) (1/1)
19:48:51.782 INFO TaskSchedulerImpl - Removed TaskSet 228.0, whose tasks have all completed, from pool
19:48:51.782 INFO DAGScheduler - ResultStage 228 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
19:48:51.782 INFO DAGScheduler - Job 171 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:51.782 INFO TaskSchedulerImpl - Killing all running tasks in stage 228: Stage finished
19:48:51.782 INFO DAGScheduler - Job 171 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058999 s
19:48:51.787 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:51.787 INFO DAGScheduler - Got job 172 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:51.787 INFO DAGScheduler - Final stage: ResultStage 229 (count at ReadsSparkSinkUnitTest.java:185)
19:48:51.787 INFO DAGScheduler - Parents of final stage: List()
19:48:51.787 INFO DAGScheduler - Missing parents: List()
19:48:51.787 INFO DAGScheduler - Submitting ResultStage 229 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:51.804 INFO MemoryStore - Block broadcast_458 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
19:48:51.805 INFO MemoryStore - Block broadcast_458_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
19:48:51.805 INFO BlockManagerInfo - Added broadcast_458_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:51.805 INFO SparkContext - Created broadcast 458 from broadcast at DAGScheduler.scala:1580
19:48:51.805 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 229 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:51.805 INFO TaskSchedulerImpl - Adding task set 229.0 with 1 tasks resource profile 0
19:48:51.806 INFO TaskSetManager - Starting task 0.0 in stage 229.0 (TID 285) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:51.806 INFO Executor - Running task 0.0 in stage 229.0 (TID 285)
19:48:51.835 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam:0+237038
19:48:51.836 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.836 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.837 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.838 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.838 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.839 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.839 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.840 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.841 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.844 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.844 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.845 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.845 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.847 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.852 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.852 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.853 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.854 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.855 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.855 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.856 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.857 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.857 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.858 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.859 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.860 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.861 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.861 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.862 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.863 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.863 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.864 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.865 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.865 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.866 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.867 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.868 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.869 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.870 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.870 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.871 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.872 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.873 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.873 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.874 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.875 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.875 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.876 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.877 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.877 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.878 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.879 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.880 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.880 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.881 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.881 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.882 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.883 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.883 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.884 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.885 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.886 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.886 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.887 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.889 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.890 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.891 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.891 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.892 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.893 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.894 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.895 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.896 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.896 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.897 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:51.898 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam dst=null perm=null proto=rpc
19:48:51.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.900 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.900 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_53ccde64-53db-4ac0-9463-128c83c06a9a.bam.bai dst=null perm=null proto=rpc
19:48:51.902 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:51.904 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.904 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:51.906 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:51.906 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:51.908 INFO Executor - Finished task 0.0 in stage 229.0 (TID 285). 989 bytes result sent to driver
19:48:51.908 INFO TaskSetManager - Finished task 0.0 in stage 229.0 (TID 285) in 102 ms on localhost (executor driver) (1/1)
19:48:51.908 INFO TaskSchedulerImpl - Removed TaskSet 229.0, whose tasks have all completed, from pool
19:48:51.909 INFO DAGScheduler - ResultStage 229 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.122 s
19:48:51.909 INFO DAGScheduler - Job 172 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:51.909 INFO TaskSchedulerImpl - Killing all running tasks in stage 229: Stage finished
19:48:51.909 INFO DAGScheduler - Job 172 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.122207 s
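Jobs 170 through 172 are the verification pass for the file just written to HDFS: the copy is re-read (hence the surrounding getfileinfo/open audit entries and the DFSUtil warnings, which appear to fire only because the loopback transfers finish with a measured duration of 0 ms), its reads are collected, and its count is compared with a count over the original local BAM. A minimal sketch of such a check, with placeholder RDD contents standing in for the real reads:

    import java.util.Arrays;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class RoundTripCheckDemo {
        public static void main(String[] args) {
            JavaSparkContext jsc = new JavaSparkContext("local[*]", "round-trip-check");
            // Stand-ins for the original local BAM reads and the copy read back from HDFS.
            JavaRDD<String> original  = jsc.parallelize(Arrays.asList("read1", "read2", "read3"));
            JavaRDD<String> roundTrip = jsc.parallelize(Arrays.asList("read1", "read2", "read3"));
            // A job like "collect at ReadsSparkSinkUnitTest.java:182" pulls the re-read records
            // back to the driver ...
            int collected = roundTrip.collect().size();
            // ... and jobs like "count at ReadsSparkSinkUnitTest.java:185" compare record counts.
            if (original.count() != roundTrip.count() || collected != 3) {
                throw new AssertionError("round trip changed the number of reads");
            }
            jsc.stop();
        }
    }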
19:48:51.918 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:51.919 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:51.919 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:51.920 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:51.922 INFO MemoryStore - Block broadcast_459 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
19:48:51.929 INFO MemoryStore - Block broadcast_459_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
19:48:51.929 INFO BlockManagerInfo - Added broadcast_459_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:51.929 INFO SparkContext - Created broadcast 459 from newAPIHadoopFile at PathSplitSource.java:96
19:48:51.950 INFO MemoryStore - Block broadcast_460 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
19:48:51.956 INFO MemoryStore - Block broadcast_460_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
19:48:51.957 INFO BlockManagerInfo - Added broadcast_460_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:51.957 INFO SparkContext - Created broadcast 460 from newAPIHadoopFile at PathSplitSource.java:96
19:48:51.978 INFO FileInputFormat - Total input files to process : 1
19:48:51.980 INFO MemoryStore - Block broadcast_461 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
19:48:51.980 INFO MemoryStore - Block broadcast_461_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
19:48:51.981 INFO BlockManagerInfo - Added broadcast_461_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:51.981 INFO SparkContext - Created broadcast 461 from broadcast at ReadsSparkSink.java:133
19:48:51.981 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:51.982 INFO MemoryStore - Block broadcast_462 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
19:48:51.982 INFO MemoryStore - Block broadcast_462_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
19:48:51.983 INFO BlockManagerInfo - Added broadcast_462_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:51.983 INFO SparkContext - Created broadcast 462 from broadcast at BamSink.java:76
19:48:51.985 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts dst=null perm=null proto=rpc
19:48:51.985 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:51.985 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:51.985 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:51.986 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
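At this point the writer has fallen back to the classic FileOutputCommitter with commit algorithm 1, which stages task output under the .parts/_temporary/0 directory just created and promotes it by rename at commit time. A different committer or algorithm would be chosen through Hadoop configuration; a small sketch of the keys involved (the values shown are assumptions for illustration, not settings made by this test):

    import org.apache.hadoop.conf.Configuration;

    public class CommitterConfigDemo {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // With no factory configured, FileOutputCommitterFactory is the default,
            // matching the "No output committer factory defined" lines above.
            conf.setIfUnset("mapreduce.outputcommitter.factory.class",
                    "org.apache.hadoop.mapreduce.lib.output.FileOutputCommitterFactory");
            // Algorithm 1 commits by renaming .../_temporary/0/task_* into the output directory
            // at job commit; algorithm 2 renames task files directly during task commit.
            conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 1);
            System.out.println(conf.get("mapreduce.fileoutputcommitter.algorithm.version"));
        }
    }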
19:48:51.992 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:51.992 INFO DAGScheduler - Registering RDD 1104 (mapToPair at SparkUtils.java:161) as input to shuffle 46
19:48:51.992 INFO DAGScheduler - Got job 173 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:51.992 INFO DAGScheduler - Final stage: ResultStage 231 (runJob at SparkHadoopWriter.scala:83)
19:48:51.992 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 230)
19:48:51.992 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 230)
19:48:51.992 INFO DAGScheduler - Submitting ShuffleMapStage 230 (MapPartitionsRDD[1104] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:52.009 INFO MemoryStore - Block broadcast_463 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
19:48:52.011 INFO MemoryStore - Block broadcast_463_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
19:48:52.011 INFO BlockManagerInfo - Added broadcast_463_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:48:52.011 INFO SparkContext - Created broadcast 463 from broadcast at DAGScheduler.scala:1580
19:48:52.011 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 230 (MapPartitionsRDD[1104] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:52.011 INFO TaskSchedulerImpl - Adding task set 230.0 with 1 tasks resource profile 0
19:48:52.011 INFO TaskSetManager - Starting task 0.0 in stage 230.0 (TID 286) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:52.012 INFO Executor - Running task 0.0 in stage 230.0 (TID 286)
19:48:52.041 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:52.058 INFO Executor - Finished task 0.0 in stage 230.0 (TID 286). 1148 bytes result sent to driver
19:48:52.058 INFO TaskSetManager - Finished task 0.0 in stage 230.0 (TID 286) in 47 ms on localhost (executor driver) (1/1)
19:48:52.058 INFO TaskSchedulerImpl - Removed TaskSet 230.0, whose tasks have all completed, from pool
19:48:52.059 INFO DAGScheduler - ShuffleMapStage 230 (mapToPair at SparkUtils.java:161) finished in 0.065 s
19:48:52.059 INFO DAGScheduler - looking for newly runnable stages
19:48:52.059 INFO DAGScheduler - running: HashSet()
19:48:52.059 INFO DAGScheduler - waiting: HashSet(ResultStage 231)
19:48:52.059 INFO DAGScheduler - failed: HashSet()
19:48:52.059 INFO DAGScheduler - Submitting ResultStage 231 (MapPartitionsRDD[1109] at mapToPair at BamSink.java:91), which has no missing parents
19:48:52.067 INFO MemoryStore - Block broadcast_464 stored as values in memory (estimated size 241.5 KiB, free 1915.7 MiB)
19:48:52.067 INFO MemoryStore - Block broadcast_464_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.6 MiB)
19:48:52.067 INFO BlockManagerInfo - Added broadcast_464_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.1 MiB)
19:48:52.068 INFO SparkContext - Created broadcast 464 from broadcast at DAGScheduler.scala:1580
19:48:52.068 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 231 (MapPartitionsRDD[1109] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:52.068 INFO TaskSchedulerImpl - Adding task set 231.0 with 1 tasks resource profile 0
19:48:52.068 INFO TaskSetManager - Starting task 0.0 in stage 231.0 (TID 287) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:52.068 INFO Executor - Running task 0.0 in stage 231.0 (TID 287)
19:48:52.072 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:52.073 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:52.083 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:52.083 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:52.083 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:52.083 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:52.083 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:52.083 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:52.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:52.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:52.087 INFO StateChange - BLOCK* allocate blk_1073741890_1066, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/part-r-00000
19:48:52.088 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741890_1066 src: /127.0.0.1:53956 dest: /127.0.0.1:45925
19:48:52.090 INFO clienttrace - src: /127.0.0.1:53956, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741890_1066, duration(ns): 1012356
19:48:52.090 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741890_1066, type=LAST_IN_PIPELINE terminating
19:48:52.091 INFO FSNamesystem - BLOCK* blk_1073741890_1066 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/part-r-00000
19:48:52.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741880_1056 replica FinalizedReplica, blk_1073741880_1056, FINALIZED
  getNumBytes() = 5472
  getBytesOnDisk() = 5472
  getVisibleLength()= 5472
  getVolume() = /tmp/minicluster_storage10689261495343833868/data/data2
  getBlockURI() = file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741880 for deletion
19:48:52.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741886_1062 replica FinalizedReplica, blk_1073741886_1062, FINALIZED
  getNumBytes() = 5472
  getBytesOnDisk() = 5472
  getVisibleLength()= 5472
  getVolume() = /tmp/minicluster_storage10689261495343833868/data/data2
  getBlockURI() = file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741886 for deletion
19:48:52.414 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741880_1056 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741880
19:48:52.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741886_1062 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741886
19:48:52.491 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:52.492 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:52.492 INFO StateChange - BLOCK* allocate blk_1073741891_1067, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/.part-r-00000.sbi
19:48:52.493 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741891_1067 src: /127.0.0.1:53962 dest: /127.0.0.1:45925
19:48:52.494 INFO clienttrace - src: /127.0.0.1:53962, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741891_1067, duration(ns): 371938
19:48:52.494 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741891_1067, type=LAST_IN_PIPELINE terminating
19:48:52.494 INFO FSNamesystem - BLOCK* blk_1073741891_1067 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/.part-r-00000.sbi
19:48:52.895 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:52.896 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0 dst=null perm=null proto=rpc
19:48:52.896 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0 dst=null perm=null proto=rpc
19:48:52.897 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/task_202507151948517797929628026253941_1109_r_000000 dst=null perm=null proto=rpc
19:48:52.897 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/_temporary/attempt_202507151948517797929628026253941_1109_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/task_202507151948517797929628026253941_1109_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:52.897 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948517797929628026253941_1109_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/task_202507151948517797929628026253941_1109_r_000000
19:48:52.897 INFO SparkHadoopMapRedUtil - attempt_202507151948517797929628026253941_1109_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:52.898 INFO Executor - Finished task 0.0 in stage 231.0 (TID 287). 1858 bytes result sent to driver
19:48:52.898 INFO TaskSetManager - Finished task 0.0 in stage 231.0 (TID 287) in 830 ms on localhost (executor driver) (1/1)
19:48:52.898 INFO TaskSchedulerImpl - Removed TaskSet 231.0, whose tasks have all completed, from pool
19:48:52.898 INFO DAGScheduler - ResultStage 231 (runJob at SparkHadoopWriter.scala:83) finished in 0.839 s
19:48:52.898 INFO DAGScheduler - Job 173 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:52.898 INFO TaskSchedulerImpl - Killing all running tasks in stage 231: Stage finished
19:48:52.898 INFO DAGScheduler - Job 173 finished: runJob at SparkHadoopWriter.scala:83, took 0.906774 s
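Job 173 is the write itself: a shuffle stage keyed at SparkUtils.java:161 feeds a result stage that pushes each partition through a Hadoop output format (BamSink.java:91) into the .parts directory. A generic, hedged sketch of driving SparkHadoopWriter this way with a plain text output format (the test uses a BAM-specific format instead, and the output path below is a placeholder):

    import java.util.Arrays;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class HadoopWriteDemo {
        public static void main(String[] args) {
            JavaSparkContext jsc = new JavaSparkContext("local[*]", "hadoop-write-demo");
            JavaPairRDD<NullWritable, Text> records = jsc
                    .parallelize(Arrays.asList("read1", "read2"))
                    .mapToPair(r -> new Tuple2<>(NullWritable.get(), new Text(r)));
            // saveAsNewAPIHadoopFile drives the "runJob at SparkHadoopWriter.scala" job seen above:
            // each task writes under <path>/_temporary, then the committer renames into place.
            records.saveAsNewAPIHadoopFile(
                    "/tmp/hadoop-write-demo.parts",
                    NullWritable.class, Text.class,
                    TextOutputFormat.class,
                    new Configuration());
            jsc.stop();
        }
    }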
19:48:52.899 INFO SparkHadoopWriter - Start to commit write Job job_202507151948517797929628026253941_1109.
19:48:52.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:52.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts dst=null perm=null proto=rpc
19:48:52.900 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/task_202507151948517797929628026253941_1109_r_000000 dst=null perm=null proto=rpc
19:48:52.900 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:52.901 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/task_202507151948517797929628026253941_1109_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:52.901 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:52.902 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary/0/task_202507151948517797929628026253941_1109_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:52.902 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:52.903 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:52.903 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:52.904 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/.spark-staging-1109 dst=null perm=null proto=rpc
19:48:52.904 INFO SparkHadoopWriter - Write Job job_202507151948517797929628026253941_1109 committed. Elapsed time: 5 ms.
19:48:52.904 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:52.906 INFO StateChange - BLOCK* allocate blk_1073741892_1068, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/header
19:48:52.907 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741892_1068 src: /127.0.0.1:53970 dest: /127.0.0.1:45925
19:48:52.908 INFO clienttrace - src: /127.0.0.1:53970, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741892_1068, duration(ns): 426328
19:48:52.908 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741892_1068, type=LAST_IN_PIPELINE terminating
19:48:52.908 INFO FSNamesystem - BLOCK* blk_1073741892_1068 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/header
19:48:53.309 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:53.310 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:53.310 INFO StateChange - BLOCK* allocate blk_1073741893_1069, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/terminator
19:48:53.311 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741893_1069 src: /127.0.0.1:53972 dest: /127.0.0.1:45925
19:48:53.312 INFO clienttrace - src: /127.0.0.1:53972, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741893_1069, duration(ns): 392745
19:48:53.312 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741893_1069, type=LAST_IN_PIPELINE terminating
19:48:53.313 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:53.313 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts dst=null perm=null proto=rpc
19:48:53.314 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:53.314 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:53.314 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam
19:48:53.315 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:53.315 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.316 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.316 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:53.316 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam done
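The header, the single part-r-00000 and the terminator are stitched together with HDFS concat and the result is renamed onto the final .bam, so no data is re-copied. A sketch of the same sequence against an HDFS FileSystem (the cluster URI and paths are placeholders; concat is only implemented by HDFS-backed file systems and imposes block-size restrictions on the source files):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsConcatDemo {
        public static void main(String[] args) throws Exception {
            // Placeholder URI; in the test this is the MiniDFSCluster at hdfs://localhost:41235.
            FileSystem fs = FileSystem.get(new URI("hdfs://localhost:8020"), new Configuration());
            Path target = new Path("/user/demo/output");          // the ".parts/output" file
            Path[] parts = {
                    new Path("/user/demo/header"),
                    new Path("/user/demo/part-r-00000"),
                    new Path("/user/demo/terminator")
            };
            fs.concat(target, parts);                              // the cmd=concat audit entry
            fs.rename(target, new Path("/user/demo/final.bam"));   // the cmd=rename onto the .bam
            fs.close();
        }
    }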
19:48:53.316 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.317 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi
19:48:53.317 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts dst=null perm=null proto=rpc
19:48:53.317 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:53.318 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:53.318 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:53.319 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:53.320 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:53.320 INFO StateChange - BLOCK* allocate blk_1073741894_1070, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi
19:48:53.321 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741894_1070 src: /127.0.0.1:53976 dest: /127.0.0.1:45925
19:48:53.322 INFO clienttrace - src: /127.0.0.1:53976, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741894_1070, duration(ns): 371416
19:48:53.322 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741894_1070, type=LAST_IN_PIPELINE terminating
19:48:53.322 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:53.323 INFO IndexFileMerger - Done merging .sbi files
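With a single part file, merging the .sbi splitting index amounts to copying the lone .part-r-00000.sbi into place as the final .bam.sbi (both are 212 bytes in the clienttrace lines above); the real IndexFileMerger may also rewrite index offsets, which this simplified, assumption-laden sketch skips entirely:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FileUtil;
    import org.apache.hadoop.fs.Path;

    public class SbiCopyDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder URI and paths standing in for the MiniDFSCluster locations above.
            FileSystem fs = FileSystem.get(new URI("hdfs://localhost:8020"), conf);
            Path partIndex = new Path("/user/demo/out.bam.parts/.part-r-00000.sbi");
            Path finalIndex = new Path("/user/demo/out.bam.sbi");
            // Copy the single part index to the final location and drop the source,
            // mirroring the create/open/delete audit entries around the merge.
            FileUtil.copy(fs, partIndex, fs, finalIndex, /* deleteSource */ true, conf);
            fs.delete(new Path("/user/demo/out.bam.parts"), /* recursive */ true);
            fs.close();
        }
    }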
19:48:53.323 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.parts dst=null perm=null proto=rpc
19:48:53.332 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi dst=null perm=null proto=rpc
19:48:53.332 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi dst=null perm=null proto=rpc
19:48:53.332 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi dst=null perm=null proto=rpc
19:48:53.333 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:53.334 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.334 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.334 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.335 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.335 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.bai dst=null perm=null proto=rpc
19:48:53.336 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bai dst=null perm=null proto=rpc
19:48:53.337 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:53.342 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi dst=null perm=null proto=rpc
19:48:53.342 INFO BlockManagerInfo - Removed broadcast_456_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.3 MiB)
19:48:53.343 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi dst=null perm=null proto=rpc
19:48:53.343 INFO BlockManagerInfo - Removed broadcast_462_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:48:53.343 INFO BlockManagerInfo - Removed broadcast_460_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:53.344 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.sbi dst=null perm=null proto=rpc
19:48:53.344 INFO BlockManagerInfo - Removed broadcast_457_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:53.345 INFO BlockManagerInfo - Removed broadcast_463_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.6 MiB)
19:48:53.345 INFO MemoryStore - Block broadcast_465 stored as values in memory (estimated size 320.0 B, free 1917.9 MiB)
19:48:53.345 INFO BlockManagerInfo - Removed broadcast_464_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.7 MiB)
19:48:53.345 INFO MemoryStore - Block broadcast_465_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.2 MiB)
19:48:53.346 INFO BlockManagerInfo - Added broadcast_465_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.7 MiB)
19:48:53.346 INFO SparkContext - Created broadcast 465 from broadcast at BamSource.java:104
19:48:53.346 INFO BlockManagerInfo - Removed broadcast_455_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:53.347 INFO BlockManagerInfo - Removed broadcast_458_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.9 MiB)
19:48:53.347 INFO MemoryStore - Block broadcast_466 stored as values in memory (estimated size 297.9 KiB, free 1918.9 MiB)
19:48:53.348 INFO BlockManagerInfo - Removed broadcast_449_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:53.348 INFO BlockManagerInfo - Removed broadcast_461_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1920.0 MiB)
19:48:53.354 INFO MemoryStore - Block broadcast_466_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:53.354 INFO BlockManagerInfo - Added broadcast_466_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:53.354 INFO SparkContext - Created broadcast 466 from newAPIHadoopFile at PathSplitSource.java:96
19:48:53.362 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.363 INFO FileInputFormat - Total input files to process : 1
19:48:53.363 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.378 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:53.378 INFO DAGScheduler - Got job 174 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:53.378 INFO DAGScheduler - Final stage: ResultStage 232 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:53.378 INFO DAGScheduler - Parents of final stage: List()
19:48:53.378 INFO DAGScheduler - Missing parents: List()
19:48:53.378 INFO DAGScheduler - Submitting ResultStage 232 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:53.384 INFO MemoryStore - Block broadcast_467 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
19:48:53.384 INFO MemoryStore - Block broadcast_467_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
19:48:53.385 INFO BlockManagerInfo - Added broadcast_467_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:53.385 INFO SparkContext - Created broadcast 467 from broadcast at DAGScheduler.scala:1580
19:48:53.385 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 232 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:53.385 INFO TaskSchedulerImpl - Adding task set 232.0 with 1 tasks resource profile 0
19:48:53.385 INFO TaskSetManager - Starting task 0.0 in stage 232.0 (TID 288) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:53.386 INFO Executor - Running task 0.0 in stage 232.0 (TID 288)
19:48:53.397 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam:0+237038
19:48:53.397 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.398 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.399 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.bai dst=null perm=null proto=rpc
19:48:53.399 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bai dst=null perm=null proto=rpc
19:48:53.400 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:53.402 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:53.402 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:53.404 INFO Executor - Finished task 0.0 in stage 232.0 (TID 288). 651483 bytes result sent to driver
19:48:53.405 INFO TaskSetManager - Finished task 0.0 in stage 232.0 (TID 288) in 20 ms on localhost (executor driver) (1/1)
19:48:53.405 INFO TaskSchedulerImpl - Removed TaskSet 232.0, whose tasks have all completed, from pool
19:48:53.405 INFO DAGScheduler - ResultStage 232 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.027 s
19:48:53.405 INFO DAGScheduler - Job 174 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:53.405 INFO TaskSchedulerImpl - Killing all running tasks in stage 232: Stage finished
19:48:53.406 INFO DAGScheduler - Job 174 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.027860 s
19:48:53.415 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:53.415 INFO DAGScheduler - Got job 175 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:53.415 INFO DAGScheduler - Final stage: ResultStage 233 (count at ReadsSparkSinkUnitTest.java:185)
19:48:53.415 INFO DAGScheduler - Parents of final stage: List()
19:48:53.415 INFO DAGScheduler - Missing parents: List()
19:48:53.415 INFO DAGScheduler - Submitting ResultStage 233 (MapPartitionsRDD[1097] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:53.432 INFO MemoryStore - Block broadcast_468 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
19:48:53.433 INFO MemoryStore - Block broadcast_468_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
19:48:53.433 INFO BlockManagerInfo - Added broadcast_468_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:53.433 INFO SparkContext - Created broadcast 468 from broadcast at DAGScheduler.scala:1580
19:48:53.433 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 233 (MapPartitionsRDD[1097] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:53.433 INFO TaskSchedulerImpl - Adding task set 233.0 with 1 tasks resource profile 0
19:48:53.434 INFO TaskSetManager - Starting task 0.0 in stage 233.0 (TID 289) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:53.434 INFO Executor - Running task 0.0 in stage 233.0 (TID 289)
19:48:53.463 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:53.472 INFO Executor - Finished task 0.0 in stage 233.0 (TID 289). 989 bytes result sent to driver
19:48:53.473 INFO TaskSetManager - Finished task 0.0 in stage 233.0 (TID 289) in 39 ms on localhost (executor driver) (1/1)
19:48:53.473 INFO TaskSchedulerImpl - Removed TaskSet 233.0, whose tasks have all completed, from pool
19:48:53.473 INFO DAGScheduler - ResultStage 233 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
19:48:53.473 INFO DAGScheduler - Job 175 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:53.473 INFO TaskSchedulerImpl - Killing all running tasks in stage 233: Stage finished
19:48:53.473 INFO DAGScheduler - Job 175 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058146 s
19:48:53.477 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:53.478 INFO DAGScheduler - Got job 176 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:53.478 INFO DAGScheduler - Final stage: ResultStage 234 (count at ReadsSparkSinkUnitTest.java:185)
19:48:53.478 INFO DAGScheduler - Parents of final stage: List()
19:48:53.478 INFO DAGScheduler - Missing parents: List()
19:48:53.478 INFO DAGScheduler - Submitting ResultStage 234 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:53.484 INFO MemoryStore - Block broadcast_469 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
19:48:53.484 INFO MemoryStore - Block broadcast_469_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
19:48:53.484 INFO BlockManagerInfo - Added broadcast_469_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:53.485 INFO SparkContext - Created broadcast 469 from broadcast at DAGScheduler.scala:1580
19:48:53.485 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 234 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:53.485 INFO TaskSchedulerImpl - Adding task set 234.0 with 1 tasks resource profile 0
19:48:53.485 INFO TaskSetManager - Starting task 0.0 in stage 234.0 (TID 290) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:53.485 INFO Executor - Running task 0.0 in stage 234.0 (TID 290)
19:48:53.496 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam:0+237038
19:48:53.497 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.497 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam dst=null perm=null proto=rpc
19:48:53.498 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bam.bai dst=null perm=null proto=rpc
19:48:53.498 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_126cf7e5-8744-4aa1-977b-b64f02950488.bai dst=null perm=null proto=rpc
19:48:53.500 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:53.501 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:53.502 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:53.504 INFO Executor - Finished task 0.0 in stage 234.0 (TID 290). 989 bytes result sent to driver
19:48:53.504 INFO TaskSetManager - Finished task 0.0 in stage 234.0 (TID 290) in 19 ms on localhost (executor driver) (1/1)
19:48:53.504 INFO TaskSchedulerImpl - Removed TaskSet 234.0, whose tasks have all completed, from pool
19:48:53.504 INFO DAGScheduler - ResultStage 234 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
19:48:53.504 INFO DAGScheduler - Job 176 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:53.504 INFO TaskSchedulerImpl - Killing all running tasks in stage 234: Stage finished
19:48:53.504 INFO DAGScheduler - Job 176 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026853 s
19:48:53.514 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:53.515 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:53.516 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:53.516 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:53.519 INFO MemoryStore - Block broadcast_470 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:53.527 INFO MemoryStore - Block broadcast_470_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:53.527 INFO BlockManagerInfo - Added broadcast_470_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:53.527 INFO SparkContext - Created broadcast 470 from newAPIHadoopFile at PathSplitSource.java:96
19:48:53.548 INFO MemoryStore - Block broadcast_471 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
19:48:53.554 INFO MemoryStore - Block broadcast_471_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:53.554 INFO BlockManagerInfo - Added broadcast_471_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:53.555 INFO SparkContext - Created broadcast 471 from newAPIHadoopFile at PathSplitSource.java:96
19:48:53.574 INFO FileInputFormat - Total input files to process : 1
19:48:53.575 INFO MemoryStore - Block broadcast_472 stored as values in memory (estimated size 160.7 KiB, free 1917.5 MiB)
19:48:53.576 INFO MemoryStore - Block broadcast_472_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
19:48:53.576 INFO BlockManagerInfo - Added broadcast_472_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:53.576 INFO SparkContext - Created broadcast 472 from broadcast at ReadsSparkSink.java:133
19:48:53.577 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:53.577 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
19:48:53.578 INFO MemoryStore - Block broadcast_473 stored as values in memory (estimated size 163.2 KiB, free 1917.4 MiB)
19:48:53.578 INFO MemoryStore - Block broadcast_473_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
19:48:53.578 INFO BlockManagerInfo - Added broadcast_473_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.5 MiB)
19:48:53.578 INFO SparkContext - Created broadcast 473 from broadcast at BamSink.java:76
19:48:53.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts dst=null perm=null proto=rpc
19:48:53.581 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:53.581 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:53.581 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:53.581 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:53.587 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:53.588 INFO DAGScheduler - Registering RDD 1129 (mapToPair at SparkUtils.java:161) as input to shuffle 47
19:48:53.588 INFO DAGScheduler - Got job 177 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:53.588 INFO DAGScheduler - Final stage: ResultStage 236 (runJob at SparkHadoopWriter.scala:83)
19:48:53.588 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 235)
19:48:53.588 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 235)
19:48:53.588 INFO DAGScheduler - Submitting ShuffleMapStage 235 (MapPartitionsRDD[1129] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:53.605 INFO MemoryStore - Block broadcast_474 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
19:48:53.606 INFO MemoryStore - Block broadcast_474_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
19:48:53.606 INFO BlockManagerInfo - Added broadcast_474_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.4 MiB)
19:48:53.606 INFO SparkContext - Created broadcast 474 from broadcast at DAGScheduler.scala:1580
19:48:53.606 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 235 (MapPartitionsRDD[1129] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:53.606 INFO TaskSchedulerImpl - Adding task set 235.0 with 1 tasks resource profile 0
19:48:53.607 INFO TaskSetManager - Starting task 0.0 in stage 235.0 (TID 291) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:53.607 INFO Executor - Running task 0.0 in stage 235.0 (TID 291)
19:48:53.638 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:53.652 INFO Executor - Finished task 0.0 in stage 235.0 (TID 291). 1148 bytes result sent to driver
19:48:53.653 INFO TaskSetManager - Finished task 0.0 in stage 235.0 (TID 291) in 46 ms on localhost (executor driver) (1/1)
19:48:53.653 INFO TaskSchedulerImpl - Removed TaskSet 235.0, whose tasks have all completed, from pool
19:48:53.653 INFO DAGScheduler - ShuffleMapStage 235 (mapToPair at SparkUtils.java:161) finished in 0.065 s
19:48:53.653 INFO DAGScheduler - looking for newly runnable stages
19:48:53.653 INFO DAGScheduler - running: HashSet()
19:48:53.653 INFO DAGScheduler - waiting: HashSet(ResultStage 236)
19:48:53.653 INFO DAGScheduler - failed: HashSet()
19:48:53.653 INFO DAGScheduler - Submitting ResultStage 236 (MapPartitionsRDD[1134] at mapToPair at BamSink.java:91), which has no missing parents
19:48:53.660 INFO MemoryStore - Block broadcast_475 stored as values in memory (estimated size 241.5 KiB, free 1916.4 MiB)
19:48:53.660 INFO MemoryStore - Block broadcast_475_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.4 MiB)
19:48:53.660 INFO BlockManagerInfo - Added broadcast_475_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.3 MiB)
19:48:53.661 INFO SparkContext - Created broadcast 475 from broadcast at DAGScheduler.scala:1580
19:48:53.661 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 236 (MapPartitionsRDD[1134] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:53.661 INFO TaskSchedulerImpl - Adding task set 236.0 with 1 tasks resource profile 0
19:48:53.661 INFO TaskSetManager - Starting task 0.0 in stage 236.0 (TID 292) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:53.662 INFO Executor - Running task 0.0 in stage 236.0 (TID 292)
19:48:53.666 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:53.666 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:53.677 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:53.677 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:53.677 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:53.677 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:53.677 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:53.677 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:53.678 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/_temporary/attempt_202507151948534982558270871935678_1134_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:53.680 INFO StateChange - BLOCK* allocate blk_1073741895_1071, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/_temporary/attempt_202507151948534982558270871935678_1134_r_000000_0/part-r-00000
19:48:53.681 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741895_1071 src: /127.0.0.1:53990 dest: /127.0.0.1:45925
19:48:53.682 INFO clienttrace - src: /127.0.0.1:53990, dest: /127.0.0.1:45925, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741895_1071, duration(ns): 837366
19:48:53.682 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741895_1071, type=LAST_IN_PIPELINE terminating
19:48:53.683 INFO FSNamesystem - BLOCK* blk_1073741895_1071 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/_temporary/attempt_202507151948534982558270871935678_1134_r_000000_0/part-r-00000
19:48:54.083 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/_temporary/attempt_202507151948534982558270871935678_1134_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:54.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/_temporary/attempt_202507151948534982558270871935678_1134_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:54.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/_temporary/attempt_202507151948534982558270871935678_1134_r_000000_0 dst=null perm=null proto=rpc
19:48:54.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/_temporary/attempt_202507151948534982558270871935678_1134_r_000000_0 dst=null perm=null proto=rpc
19:48:54.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/task_202507151948534982558270871935678_1134_r_000000 dst=null perm=null proto=rpc
19:48:54.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/_temporary/attempt_202507151948534982558270871935678_1134_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/task_202507151948534982558270871935678_1134_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:54.086 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948534982558270871935678_1134_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/task_202507151948534982558270871935678_1134_r_000000
19:48:54.086 INFO SparkHadoopMapRedUtil - attempt_202507151948534982558270871935678_1134_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:54.087 INFO Executor - Finished task 0.0 in stage 236.0 (TID 292). 1858 bytes result sent to driver
19:48:54.087 INFO TaskSetManager - Finished task 0.0 in stage 236.0 (TID 292) in 426 ms on localhost (executor driver) (1/1)
19:48:54.087 INFO TaskSchedulerImpl - Removed TaskSet 236.0, whose tasks have all completed, from pool
19:48:54.087 INFO DAGScheduler - ResultStage 236 (runJob at SparkHadoopWriter.scala:83) finished in 0.434 s
19:48:54.087 INFO DAGScheduler - Job 177 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:54.087 INFO TaskSchedulerImpl - Killing all running tasks in stage 236: Stage finished
19:48:54.087 INFO DAGScheduler - Job 177 finished: runJob at SparkHadoopWriter.scala:83, took 0.500113 s
19:48:54.088 INFO SparkHadoopWriter - Start to commit write Job job_202507151948534982558270871935678_1134.
19:48:54.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:54.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts dst=null perm=null proto=rpc
19:48:54.089 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/task_202507151948534982558270871935678_1134_r_000000 dst=null perm=null proto=rpc
19:48:54.089 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:54.089 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary/0/task_202507151948534982558270871935678_1134_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:54.090 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:54.090 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:54.091 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:54.092 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/.spark-staging-1134 dst=null perm=null proto=rpc
19:48:54.092 INFO SparkHadoopWriter - Write Job job_202507151948534982558270871935678_1134 committed. Elapsed time: 4 ms.
19:48:54.092 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:54.093 INFO StateChange - BLOCK* allocate blk_1073741896_1072, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/header
19:48:54.094 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741896_1072 src: /127.0.0.1:53994 dest: /127.0.0.1:45925
19:48:54.095 INFO clienttrace - src: /127.0.0.1:53994, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741896_1072, duration(ns): 345238
19:48:54.095 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741896_1072, type=LAST_IN_PIPELINE terminating
19:48:54.096 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:54.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:54.097 INFO StateChange - BLOCK* allocate blk_1073741897_1073, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/terminator
19:48:54.098 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741897_1073 src: /127.0.0.1:54004 dest: /127.0.0.1:45925
19:48:54.099 INFO clienttrace - src: /127.0.0.1:54004, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741897_1073, duration(ns): 381686
19:48:54.099 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741897_1073, type=LAST_IN_PIPELINE terminating
19:48:54.099 INFO FSNamesystem - BLOCK* blk_1073741897_1073 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/terminator
19:48:54.500 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:54.501 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts dst=null perm=null proto=rpc
19:48:54.501 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:54.502 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:54.502 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam
19:48:54.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:54.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.504 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:54.504 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam done
19:48:54.504 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.parts dst=null perm=null proto=rpc
19:48:54.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.506 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.506 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.bai dst=null perm=null proto=rpc
19:48:54.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bai dst=null perm=null proto=rpc
19:48:54.508 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:54.510 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.510 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.sbi dst=null perm=null proto=rpc
19:48:54.511 INFO MemoryStore - Block broadcast_476 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
19:48:54.517 INFO BlockManagerInfo - Removed broadcast_465_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.3 MiB)
19:48:54.517 INFO BlockManagerInfo - Removed broadcast_472_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:48:54.518 INFO BlockManagerInfo - Removed broadcast_471_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:54.518 INFO BlockManagerInfo - Removed broadcast_466_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:54.519 INFO BlockManagerInfo - Removed broadcast_474_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.6 MiB)
19:48:54.520 INFO BlockManagerInfo - Removed broadcast_475_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.6 MiB)
19:48:54.520 INFO BlockManagerInfo - Removed broadcast_469_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.7 MiB)
19:48:54.520 INFO BlockManagerInfo - Removed broadcast_459_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.7 MiB)
19:48:54.521 INFO BlockManagerInfo - Removed broadcast_468_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.9 MiB)
19:48:54.521 INFO BlockManagerInfo - Removed broadcast_473_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.9 MiB)
19:48:54.522 INFO BlockManagerInfo - Removed broadcast_467_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1920.0 MiB)
19:48:54.523 INFO MemoryStore - Block broadcast_476_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:54.523 INFO BlockManagerInfo - Added broadcast_476_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:54.524 INFO SparkContext - Created broadcast 476 from newAPIHadoopFile at PathSplitSource.java:96
19:48:54.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.544 INFO FileInputFormat - Total input files to process : 1
19:48:54.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.580 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:54.581 INFO DAGScheduler - Got job 178 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:54.581 INFO DAGScheduler - Final stage: ResultStage 237 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:54.581 INFO DAGScheduler - Parents of final stage: List()
19:48:54.581 INFO DAGScheduler - Missing parents: List()
19:48:54.581 INFO DAGScheduler - Submitting ResultStage 237 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:54.597 INFO MemoryStore - Block broadcast_477 stored as values in memory (estimated size 426.2 KiB, free 1918.9 MiB)
19:48:54.599 INFO MemoryStore - Block broadcast_477_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1918.8 MiB)
19:48:54.599 INFO BlockManagerInfo - Added broadcast_477_piece0 in memory on localhost:36125 (size: 153.7 KiB, free: 1919.8 MiB)
19:48:54.599 INFO SparkContext - Created broadcast 477 from broadcast at DAGScheduler.scala:1580
19:48:54.599 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 237 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:54.599 INFO TaskSchedulerImpl - Adding task set 237.0 with 1 tasks resource profile 0
19:48:54.600 INFO TaskSetManager - Starting task 0.0 in stage 237.0 (TID 293) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:54.600 INFO Executor - Running task 0.0 in stage 237.0 (TID 293)
19:48:54.629 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam:0+237038
19:48:54.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.631 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.632 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:54.632 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.633 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.633 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.bai dst=null perm=null proto=rpc
19:48:54.633 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bai dst=null perm=null proto=rpc
19:48:54.635 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:54.636 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.636 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.637 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.642 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.643 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.643 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.644 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.644 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.645 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.645 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.646 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.647 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.647 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.648 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.648 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.649 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.650 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.650 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.651 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.651 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.652 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.652 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.653 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.654 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.655 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.655 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.656 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.657 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.658 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.659 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.659 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.660 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.661 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.661 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.662 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.663 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.664 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.665 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.665 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.666 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.667 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.667 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.668 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.668 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.669 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.670 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.671 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.672 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.673 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.674 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.674 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.675 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.676 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.676 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.677 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.678 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.678 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.679 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.680 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.680 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.681 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.682 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.682 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.683 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.683 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.684 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.685 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.685 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.685 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.686 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.686 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.bai dst=null perm=null proto=rpc
19:48:54.687 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bai dst=null perm=null proto=rpc
19:48:54.688 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:54.690 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.690 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:54.692 INFO Executor - Finished task 0.0 in stage 237.0 (TID 293). 651483 bytes result sent to driver
19:48:54.694 INFO TaskSetManager - Finished task 0.0 in stage 237.0 (TID 293) in 94 ms on localhost (executor driver) (1/1)
19:48:54.694 INFO TaskSchedulerImpl - Removed TaskSet 237.0, whose tasks have all completed, from pool
19:48:54.694 INFO DAGScheduler - ResultStage 237 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.113 s
19:48:54.694 INFO DAGScheduler - Job 178 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:54.694 INFO TaskSchedulerImpl - Killing all running tasks in stage 237: Stage finished
19:48:54.694 INFO DAGScheduler - Job 178 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.113649 s
19:48:54.703 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:54.704 INFO DAGScheduler - Got job 179 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:54.704 INFO DAGScheduler - Final stage: ResultStage 238 (count at ReadsSparkSinkUnitTest.java:185)
19:48:54.704 INFO DAGScheduler - Parents of final stage: List()
19:48:54.704 INFO DAGScheduler - Missing parents: List()
19:48:54.704 INFO DAGScheduler - Submitting ResultStage 238 (MapPartitionsRDD[1122] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:54.720 INFO MemoryStore - Block broadcast_478 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
19:48:54.721 INFO MemoryStore - Block broadcast_478_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
19:48:54.721 INFO BlockManagerInfo - Added broadcast_478_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.6 MiB)
19:48:54.722 INFO SparkContext - Created broadcast 478 from broadcast at DAGScheduler.scala:1580
19:48:54.722 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 238 (MapPartitionsRDD[1122] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:54.722 INFO TaskSchedulerImpl - Adding task set 238.0 with 1 tasks resource profile 0
19:48:54.722 INFO TaskSetManager - Starting task 0.0 in stage 238.0 (TID 294) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:48:54.722 INFO Executor - Running task 0.0 in stage 238.0 (TID 294)
19:48:54.751 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:54.760 INFO Executor - Finished task 0.0 in stage 238.0 (TID 294). 989 bytes result sent to driver
19:48:54.761 INFO TaskSetManager - Finished task 0.0 in stage 238.0 (TID 294) in 39 ms on localhost (executor driver) (1/1)
19:48:54.761 INFO TaskSchedulerImpl - Removed TaskSet 238.0, whose tasks have all completed, from pool
19:48:54.761 INFO DAGScheduler - ResultStage 238 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
19:48:54.761 INFO DAGScheduler - Job 179 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:54.761 INFO TaskSchedulerImpl - Killing all running tasks in stage 238: Stage finished
19:48:54.761 INFO DAGScheduler - Job 179 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057548 s
19:48:54.764 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:54.764 INFO DAGScheduler - Got job 180 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:54.764 INFO DAGScheduler - Final stage: ResultStage 239 (count at ReadsSparkSinkUnitTest.java:185)
19:48:54.764 INFO DAGScheduler - Parents of final stage: List()
19:48:54.764 INFO DAGScheduler - Missing parents: List()
19:48:54.764 INFO DAGScheduler - Submitting ResultStage 239 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:54.781 INFO MemoryStore - Block broadcast_479 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
19:48:54.782 INFO MemoryStore - Block broadcast_479_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
19:48:54.782 INFO BlockManagerInfo - Added broadcast_479_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:48:54.782 INFO SparkContext - Created broadcast 479 from broadcast at DAGScheduler.scala:1580
19:48:54.782 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 239 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:54.782 INFO TaskSchedulerImpl - Adding task set 239.0 with 1 tasks resource profile 0
19:48:54.783 INFO TaskSetManager - Starting task 0.0 in stage 239.0 (TID 295) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:54.783 INFO Executor - Running task 0.0 in stage 239.0 (TID 295)
19:48:54.810 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam:0+237038
19:48:54.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.812 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:54.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.bai dst=null perm=null proto=rpc
19:48:54.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bai dst=null perm=null proto=rpc
19:48:54.815 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:54.816 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.816 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.817 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.817 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:54.821 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.822 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.823 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.824 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.824 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.825 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.826 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.826 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.827 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.828 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.828 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.829 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.830 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.831 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.832 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.832 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.833 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.834 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.834 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.835 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.836 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.836 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.837 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.837 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.838 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.839 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.840 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.840 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.841 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.842 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.842 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.843 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.844 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.844 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.845 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.846 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.847 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.848 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.848 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.849 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.850 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.851 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.852 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.853 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.853 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.854 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.855 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.855 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.856 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.857 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.857 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.858 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.859 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.860 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.860 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.861 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.863 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.864 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
19:48:54.865 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.866 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam dst=null perm=null proto=rpc
19:48:54.867 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bam.bai dst=null perm=null proto=rpc
19:48:54.867 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_7ffc5ca7-70d7-4cf1-8f04-3f05503999f6.bai dst=null perm=null proto=rpc
19:48:54.868 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:54.870 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
19:48:54.872 INFO Executor - Finished task 0.0 in stage 239.0 (TID 295). 989 bytes result sent to driver
19:48:54.872 INFO TaskSetManager - Finished task 0.0 in stage 239.0 (TID 295) in 89 ms on localhost (executor driver) (1/1)
19:48:54.872 INFO TaskSchedulerImpl - Removed TaskSet 239.0, whose tasks have all completed, from pool
19:48:54.872 INFO DAGScheduler - ResultStage 239 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.107 s
19:48:54.872 INFO DAGScheduler - Job 180 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:54.872 INFO TaskSchedulerImpl - Killing all running tasks in stage 239: Stage finished
19:48:54.873 INFO DAGScheduler - Job 180 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.108468 s
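
The count jobs logged above (e.g. "count at ReadsSparkSinkUnitTest.java:185") are ordinary Spark actions; each call to count() on an RDD launches one job, which DAGScheduler then reports as finished. A minimal, self-contained Java sketch of that pattern, assuming a local master and placeholder data rather than the test's actual BAM-backed RDD:

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class CountSketch {
        public static void main(String[] args) {
            // Local-mode context, comparable to the "executor driver" entries above.
            SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("count-sketch");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaRDD<String> records = sc.parallelize(Arrays.asList("r1", "r2", "r3"));
                // Each count() is an action: it submits one job with a single ResultStage,
                // which is what appears as "Job ... finished: count at ..." in the log.
                long n = records.count();
                System.out.println("records = " + n);
            }
        }
    }
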
19:48:54.881 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:54.882 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:54.882 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:54.883 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:54.885 INFO MemoryStore - Block broadcast_480 stored as values in memory (estimated size 298.0 KiB, free 1917.3 MiB)
19:48:54.892 INFO MemoryStore - Block broadcast_480_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.3 MiB)
19:48:54.892 INFO BlockManagerInfo - Added broadcast_480_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.4 MiB)
19:48:54.892 INFO SparkContext - Created broadcast 480 from newAPIHadoopFile at PathSplitSource.java:96
19:48:54.913 INFO MemoryStore - Block broadcast_481 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
19:48:54.919 INFO MemoryStore - Block broadcast_481_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.9 MiB)
19:48:54.919 INFO BlockManagerInfo - Added broadcast_481_piece0 in memory on localhost:36125 (size: 50.3 KiB, free: 1919.4 MiB)
19:48:54.919 INFO SparkContext - Created broadcast 481 from newAPIHadoopFile at PathSplitSource.java:96
19:48:54.939 INFO FileInputFormat - Total input files to process : 1
19:48:54.941 INFO MemoryStore - Block broadcast_482 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
19:48:54.942 INFO MemoryStore - Block broadcast_482_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
19:48:54.942 INFO BlockManagerInfo - Added broadcast_482_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:54.942 INFO SparkContext - Created broadcast 482 from broadcast at ReadsSparkSink.java:133
19:48:54.943 INFO MemoryStore - Block broadcast_483 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
19:48:54.944 INFO MemoryStore - Block broadcast_483_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
19:48:54.944 INFO BlockManagerInfo - Added broadcast_483_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:54.944 INFO SparkContext - Created broadcast 483 from broadcast at BamSink.java:76
19:48:54.946 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts dst=null perm=null proto=rpc
19:48:54.946 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:54.946 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:54.946 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:54.947 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:54.953 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:54.953 INFO DAGScheduler - Registering RDD 1155 (mapToPair at SparkUtils.java:161) as input to shuffle 48
19:48:54.953 INFO DAGScheduler - Got job 181 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:54.953 INFO DAGScheduler - Final stage: ResultStage 241 (runJob at SparkHadoopWriter.scala:83)
19:48:54.953 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 240)
19:48:54.954 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 240)
19:48:54.954 INFO DAGScheduler - Submitting ShuffleMapStage 240 (MapPartitionsRDD[1155] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:54.971 INFO MemoryStore - Block broadcast_484 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
19:48:54.973 INFO MemoryStore - Block broadcast_484_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
19:48:54.973 INFO BlockManagerInfo - Added broadcast_484_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:48:54.973 INFO SparkContext - Created broadcast 484 from broadcast at DAGScheduler.scala:1580
19:48:54.973 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 240 (MapPartitionsRDD[1155] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:54.973 INFO TaskSchedulerImpl - Adding task set 240.0 with 1 tasks resource profile 0
19:48:54.974 INFO TaskSetManager - Starting task 0.0 in stage 240.0 (TID 296) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
19:48:54.974 INFO Executor - Running task 0.0 in stage 240.0 (TID 296)
19:48:55.003 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:55.019 INFO Executor - Finished task 0.0 in stage 240.0 (TID 296). 1148 bytes result sent to driver
19:48:55.019 INFO TaskSetManager - Finished task 0.0 in stage 240.0 (TID 296) in 45 ms on localhost (executor driver) (1/1)
19:48:55.019 INFO TaskSchedulerImpl - Removed TaskSet 240.0, whose tasks have all completed, from pool
19:48:55.020 INFO DAGScheduler - ShuffleMapStage 240 (mapToPair at SparkUtils.java:161) finished in 0.066 s
19:48:55.020 INFO DAGScheduler - looking for newly runnable stages
19:48:55.020 INFO DAGScheduler - running: HashSet()
19:48:55.020 INFO DAGScheduler - waiting: HashSet(ResultStage 241)
19:48:55.020 INFO DAGScheduler - failed: HashSet()
19:48:55.020 INFO DAGScheduler - Submitting ResultStage 241 (MapPartitionsRDD[1160] at mapToPair at BamSink.java:91), which has no missing parents
19:48:55.026 INFO MemoryStore - Block broadcast_485 stored as values in memory (estimated size 241.5 KiB, free 1915.7 MiB)
19:48:55.027 INFO MemoryStore - Block broadcast_485_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.6 MiB)
19:48:55.027 INFO BlockManagerInfo - Added broadcast_485_piece0 in memory on localhost:36125 (size: 67.1 KiB, free: 1919.1 MiB)
19:48:55.027 INFO SparkContext - Created broadcast 485 from broadcast at DAGScheduler.scala:1580
19:48:55.028 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 241 (MapPartitionsRDD[1160] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:55.028 INFO TaskSchedulerImpl - Adding task set 241.0 with 1 tasks resource profile 0
19:48:55.028 INFO TaskSetManager - Starting task 0.0 in stage 241.0 (TID 297) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:55.028 INFO Executor - Running task 0.0 in stage 241.0 (TID 297)
19:48:55.032 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:55.032 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
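
Stages 240/241 above show the usual two-stage write plan: a ShuffleMapStage produced by the pair transformation at SparkUtils.java:161 feeding a ResultStage that writes the output at BamSink.java:91, with the result stage fetching its shuffle blocks locally. A hedged Java sketch of how a mapToPair followed by a key sort splits one action into a shuffle stage plus a result stage (data and names are placeholders, not the test's code):

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class ShuffleSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("shuffle-sketch");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaRDD<String> reads = sc.parallelize(Arrays.asList("b", "a", "c"));
                // mapToPair + sortByKey forces a shuffle: Spark plans a ShuffleMapStage
                // for the keyed RDD and a downstream ResultStage for the action.
                JavaPairRDD<String, Integer> keyed = reads.mapToPair(r -> new Tuple2<>(r, 1));
                keyed.sortByKey().count();
            }
        }
    }
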
19:48:55.042 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:55.042 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:55.042 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:55.042 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:55.042 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:55.042 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:55.044 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:55.044 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:55.045 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:55.047 INFO StateChange - BLOCK* allocate blk_1073741898_1074, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/part-r-00000
19:48:55.048 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741898_1074 src: /127.0.0.1:54702 dest: /127.0.0.1:45925
19:48:55.050 INFO clienttrace - src: /127.0.0.1:54702, dest: /127.0.0.1:45925, bytes: 229774, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741898_1074, duration(ns): 915155
19:48:55.050 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741898_1074, type=LAST_IN_PIPELINE terminating
19:48:55.050 INFO FSNamesystem - BLOCK* blk_1073741898_1074 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/part-r-00000
19:48:55.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741891_1067 replica FinalizedReplica, blk_1073741891_1067, FINALIZED
  getNumBytes()      = 212
  getBytesOnDisk()   = 212
  getVisibleLength() = 212
  getVolume()        = /tmp/minicluster_storage10689261495343833868/data/data1
  getBlockURI()      = file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741891 for deletion
19:48:55.414 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741891_1067 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741891
19:48:55.451 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:55.451 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:55.452 INFO StateChange - BLOCK* allocate blk_1073741899_1075, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/.part-r-00000.sbi
19:48:55.453 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741899_1075 src: /127.0.0.1:54706 dest: /127.0.0.1:45925
19:48:55.454 INFO clienttrace - src: /127.0.0.1:54706, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741899_1075, duration(ns): 383176
19:48:55.454 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741899_1075, type=LAST_IN_PIPELINE terminating
19:48:55.454 INFO FSNamesystem - BLOCK* blk_1073741899_1075 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/.part-r-00000.sbi
19:48:55.855 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:55.860 INFO BlockManagerInfo - Removed broadcast_478_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.3 MiB)
19:48:55.861 INFO BlockManagerInfo - Removed broadcast_481_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.3 MiB)
19:48:55.861 INFO BlockManagerInfo - Removed broadcast_476_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:55.861 INFO StateChange - BLOCK* allocate blk_1073741900_1076, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/.part-r-00000.bai
19:48:55.862 INFO BlockManagerInfo - Removed broadcast_479_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.5 MiB)
19:48:55.862 INFO BlockManagerInfo - Removed broadcast_477_piece0 on localhost:36125 in memory (size: 153.7 KiB, free: 1919.7 MiB)
19:48:55.862 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741900_1076 src: /127.0.0.1:54722 dest: /127.0.0.1:45925
19:48:55.862 INFO BlockManagerInfo - Removed broadcast_484_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.8 MiB)
19:48:55.863 INFO BlockManagerInfo - Removed broadcast_470_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:55.864 INFO clienttrace - src: /127.0.0.1:54722, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741900_1076, duration(ns): 450790
19:48:55.864 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741900_1076, type=LAST_IN_PIPELINE terminating
19:48:55.864 INFO FSNamesystem - BLOCK* blk_1073741900_1076 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/.part-r-00000.bai
19:48:56.265 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
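
The roughly 400 ms gaps between each "COMMITTED but not COMPLETE" line and its matching "DIR* completeFile" line above appear to be the DFS client retrying completeFile against the NameNode until the last block reports enough finalized replicas (minReplication = 1 on this single-datanode cluster); the retry and its delay happen inside the output stream's close(). A minimal HDFS write sketch under that assumption, with a placeholder URI and path:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsWriteSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder URI; the test talks to its MiniDFSCluster NameNode instead.
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:8020"), conf);
            Path p = new Path("/tmp/sketch/part-r-00000");   // placeholder path
            try (FSDataOutputStream out = fs.create(p, true)) {
                out.write(new byte[]{1, 2, 3});
            }   // close() only returns once the NameNode marks the last block COMPLETE,
                // which is the completeFile step visible in the log.
            System.out.println("length = " + fs.getFileStatus(p).getLen());
        }
    }
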
19:48:56.266 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0 dst=null perm=null proto=rpc
19:48:56.266 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0 dst=null perm=null proto=rpc
19:48:56.266 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/task_202507151948548932130328626631660_1160_r_000000 dst=null perm=null proto=rpc
19:48:56.267 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/_temporary/attempt_202507151948548932130328626631660_1160_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/task_202507151948548932130328626631660_1160_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:56.267 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948548932130328626631660_1160_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/task_202507151948548932130328626631660_1160_r_000000
19:48:56.267 INFO SparkHadoopMapRedUtil - attempt_202507151948548932130328626631660_1160_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:56.268 INFO Executor - Finished task 0.0 in stage 241.0 (TID 297). 1901 bytes result sent to driver
19:48:56.268 INFO TaskSetManager - Finished task 0.0 in stage 241.0 (TID 297) in 1240 ms on localhost (executor driver) (1/1)
19:48:56.268 INFO TaskSchedulerImpl - Removed TaskSet 241.0, whose tasks have all completed, from pool
19:48:56.268 INFO DAGScheduler - ResultStage 241 (runJob at SparkHadoopWriter.scala:83) finished in 1.248 s
19:48:56.268 INFO DAGScheduler - Job 181 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:56.268 INFO TaskSchedulerImpl - Killing all running tasks in stage 241: Stage finished
19:48:56.268 INFO DAGScheduler - Job 181 finished: runJob at SparkHadoopWriter.scala:83, took 1.315458 s
19:48:56.269 INFO SparkHadoopWriter - Start to commit write Job job_202507151948548932130328626631660_1160.
19:48:56.269 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:56.269 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts dst=null perm=null proto=rpc
19:48:56.270 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/task_202507151948548932130328626631660_1160_r_000000 dst=null perm=null proto=rpc
19:48:56.270 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:56.271 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/task_202507151948548932130328626631660_1160_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.271 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:56.271 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/task_202507151948548932130328626631660_1160_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.272 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:56.272 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary/0/task_202507151948548932130328626631660_1160_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.273 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:56.273 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.274 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:56.274 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.spark-staging-1160 dst=null perm=null proto=rpc
19:48:56.274 INFO SparkHadoopWriter - Write Job job_202507151948548932130328626631660_1160 committed. Elapsed time: 5 ms.
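
The write path above follows the standard FileOutputCommitter algorithm-version-1 protocol: the task writes under .../_temporary/0/_temporary/attempt_*, the committed task attempt is renamed to .../_temporary/0/task_*, the job commit renames the task's files into the destination directory, deletes _temporary, and creates a _SUCCESS marker. A simplified sketch of that rename sequence using only the FileSystem API (paths are placeholders; this illustrates the protocol, it is not the committer's code):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CommitProtocolSketch {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:8020"), new Configuration());
            Path out = new Path("/user/someone/output.bam.parts");                      // placeholder
            Path attempt = new Path(out, "_temporary/0/_temporary/attempt_0/part-r-00000");
            Path task = new Path(out, "_temporary/0/task_0");
            fs.create(attempt, true).close();                                            // task writes its output
            fs.rename(attempt.getParent(), task);                                        // task commit
            fs.rename(new Path(task, "part-r-00000"), new Path(out, "part-r-00000"));    // job commit
            fs.delete(new Path(out, "_temporary"), true);                                // cleanup
            fs.create(new Path(out, "_SUCCESS"), true).close();                          // success marker
        }
    }
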
19:48:56.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.276 INFO StateChange - BLOCK* allocate blk_1073741901_1077, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/header
19:48:56.277 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741901_1077 src: /127.0.0.1:54730 dest: /127.0.0.1:45925
19:48:56.278 INFO clienttrace - src: /127.0.0.1:54730, dest: /127.0.0.1:45925, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741901_1077, duration(ns): 424265
19:48:56.278 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741901_1077, type=LAST_IN_PIPELINE terminating
19:48:56.278 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:56.279 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.280 INFO StateChange - BLOCK* allocate blk_1073741902_1078, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/terminator
19:48:56.280 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741902_1078 src: /127.0.0.1:54740 dest: /127.0.0.1:45925
19:48:56.281 INFO clienttrace - src: /127.0.0.1:54740, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741902_1078, duration(ns): 319932
19:48:56.281 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741902_1078, type=LAST_IN_PIPELINE terminating
19:48:56.282 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:56.282 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts dst=null perm=null proto=rpc
19:48:56.283 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.283 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:56.283 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam
19:48:56.284 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.284 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.285 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.285 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.285 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam done
19:48:56.286 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
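
HadoopFileSystemWrapper's "Concatenating 3 parts" step maps onto HDFS's concat operation, as the audit entry shows: an empty target file is created, the header, part-r-00000 and terminator files are stitched onto it on the NameNode without copying bytes, and the result is renamed to the final .bam. A hedged sketch of that call shape with FileSystem#concat, which only HDFS (DistributedFileSystem) implements and which is subject to HDFS's same-directory and block-size preconditions; URI and paths are placeholders and the source files are assumed to already exist:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConcatSketch {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:8020"), new Configuration());
            Path parts = new Path("/user/someone/out.bam.parts");    // placeholder directory
            Path target = new Path(parts, "output");
            fs.create(target, true).close();                          // empty target, as in the log
            fs.concat(target, new Path[] {                            // block-level merge on the NameNode
                    new Path(parts, "header"),
                    new Path(parts, "part-r-00000"),
                    new Path(parts, "terminator")
            });
            fs.rename(target, new Path("/user/someone/out.bam"));     // move the merged file into place
        }
    }
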
19:48:56.286 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi
19:48:56.286 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts dst=null perm=null proto=rpc
19:48:56.286 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.287 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:56.288 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:56.289 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:56.289 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:56.289 INFO StateChange - BLOCK* allocate blk_1073741903_1079, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi
19:48:56.290 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741903_1079 src: /127.0.0.1:54746 dest: /127.0.0.1:45925
19:48:56.291 INFO clienttrace - src: /127.0.0.1:54746, dest: /127.0.0.1:45925, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741903_1079, duration(ns): 334537
19:48:56.291 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741903_1079, type=LAST_IN_PIPELINE terminating
19:48:56.291 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:56.292 INFO IndexFileMerger - Done merging .sbi files
19:48:56.292 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai
19:48:56.292 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts dst=null perm=null proto=rpc
19:48:56.292 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.293 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:56.293 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:56.294 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:56.294 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:56.295 INFO StateChange - BLOCK* allocate blk_1073741904_1080, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai
19:48:56.296 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741904_1080 src: /127.0.0.1:54758 dest: /127.0.0.1:45925
19:48:56.297 INFO clienttrace - src: /127.0.0.1:54758, dest: /127.0.0.1:45925, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741904_1080, duration(ns): 325249
19:48:56.297 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741904_1080, type=LAST_IN_PIPELINE terminating
19:48:56.297 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:56.297 INFO IndexFileMerger - Done merging .bai files
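
With a single output partition, "merging" the .sbi and .bai indexes reduces to copying the lone .part-r-00000 index to its final name and deleting the temporary copy, which is the open/create/delete sequence in the audit lines above; with several parts the merger also has to combine index contents, which the sketch below deliberately ignores. A minimal stream-copy illustration with placeholder paths, not the IndexFileMerger implementation:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class IndexCopySketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:8020"), conf);   // placeholder
            Path partIndex = new Path("/user/someone/out.bam.parts/.part-r-00000.bai");  // placeholder
            Path finalIndex = new Path("/user/someone/out.bam.bai");                     // placeholder
            try (FSDataInputStream in = fs.open(partIndex);
                 FSDataOutputStream out = fs.create(finalIndex, true)) {
                IOUtils.copyBytes(in, out, conf, false);   // stream the single part index across
            }
            fs.delete(partIndex, false);                   // drop the temporary copy, as logged
        }
    }
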
19:48:56.298 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.parts dst=null perm=null proto=rpc
19:48:56.307 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.315 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi dst=null perm=null proto=rpc
19:48:56.315 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi dst=null perm=null proto=rpc
19:48:56.315 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi dst=null perm=null proto=rpc
19:48:56.316 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:56.316 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.316 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.317 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.317 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.318 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.318 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.318 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.319 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:56.321 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:56.321 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:56.321 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
19:48:56.321 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi dst=null perm=null proto=rpc
19:48:56.322 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi dst=null perm=null proto=rpc
19:48:56.323 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.sbi dst=null perm=null proto=rpc
19:48:56.323 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
19:48:56.323 INFO MemoryStore - Block broadcast_486 stored as values in memory (estimated size 320.0 B, free 1919.0 MiB)
19:48:56.324 INFO MemoryStore - Block broadcast_486_piece0 stored as bytes in memory (estimated size 233.0 B, free 1919.0 MiB)
19:48:56.324 INFO BlockManagerInfo - Added broadcast_486_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.9 MiB)
19:48:56.324 INFO SparkContext - Created broadcast 486 from broadcast at BamSource.java:104
19:48:56.325 INFO MemoryStore - Block broadcast_487 stored as values in memory (estimated size 297.9 KiB, free 1918.7 MiB)
19:48:56.331 INFO MemoryStore - Block broadcast_487_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
19:48:56.331 INFO BlockManagerInfo - Added broadcast_487_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.8 MiB)
19:48:56.331 INFO SparkContext - Created broadcast 487 from newAPIHadoopFile at PathSplitSource.java:96
19:48:56.340 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.340 INFO FileInputFormat - Total input files to process : 1
19:48:56.341 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.355 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:56.355 INFO DAGScheduler - Got job 182 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:56.355 INFO DAGScheduler - Final stage: ResultStage 242 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:56.355 INFO DAGScheduler - Parents of final stage: List()
19:48:56.355 INFO DAGScheduler - Missing parents: List()
19:48:56.356 INFO DAGScheduler - Submitting ResultStage 242 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:56.362 INFO MemoryStore - Block broadcast_488 stored as values in memory (estimated size 148.2 KiB, free 1918.5 MiB)
19:48:56.362 INFO MemoryStore - Block broadcast_488_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.5 MiB)
19:48:56.362 INFO BlockManagerInfo - Added broadcast_488_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:56.362 INFO SparkContext - Created broadcast 488 from broadcast at DAGScheduler.scala:1580
19:48:56.363 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 242 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:56.363 INFO TaskSchedulerImpl - Adding task set 242.0 with 1 tasks resource profile 0
19:48:56.363 INFO TaskSetManager - Starting task 0.0 in stage 242.0 (TID 298) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:56.363 INFO Executor - Running task 0.0 in stage 242.0 (TID 298)
19:48:56.374 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam:0+235514
19:48:56.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.376 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.378 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:56.380 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:56.380 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:56.381 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
19:48:56.382 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:56.384 INFO Executor - Finished task 0.0 in stage 242.0 (TID 298). 650141 bytes result sent to driver
19:48:56.385 INFO TaskSetManager - Finished task 0.0 in stage 242.0 (TID 298) in 22 ms on localhost (executor driver) (1/1)
19:48:56.385 INFO TaskSchedulerImpl - Removed TaskSet 242.0, whose tasks have all completed, from pool
19:48:56.385 INFO DAGScheduler - ResultStage 242 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
19:48:56.385 INFO DAGScheduler - Job 182 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:56.385 INFO TaskSchedulerImpl - Killing all running tasks in stage 242: Stage finished
19:48:56.385 INFO DAGScheduler - Job 182 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.030239 s
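
Reading the freshly written .bam back (broadcasts 486/487 and the collect at ReadsSparkSinkUnitTest.java:182) goes through newAPIHadoopFile. The test uses a BAM-aware input format via PathSplitSource/BamSource; the sketch below substitutes the generic TextInputFormat purely to show the call shape, so it is not what the test itself runs, and the path is a placeholder:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ReadBackSketch {
        public static void main(String[] args) {
            SparkConf sparkConf = new SparkConf().setMaster("local[*]").setAppName("read-back-sketch");
            try (JavaSparkContext sc = new JavaSparkContext(sparkConf)) {
                // Placeholder path; the test reads hdfs://localhost:<port>/user/runner/....bam instead.
                JavaPairRDD<LongWritable, Text> records = sc.newAPIHadoopFile(
                        "hdfs://localhost:8020/user/someone/out.txt",
                        TextInputFormat.class, LongWritable.class, Text.class,
                        new Configuration());
                // collect() is the action behind "collect at ReadsSparkSinkUnitTest.java:182".
                System.out.println("records read back = " + records.collect().size());
            }
        }
    }
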
19:48:56.395 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:56.395 INFO DAGScheduler - Got job 183 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:56.395 INFO DAGScheduler - Final stage: ResultStage 243 (count at ReadsSparkSinkUnitTest.java:185)
19:48:56.395 INFO DAGScheduler - Parents of final stage: List()
19:48:56.395 INFO DAGScheduler - Missing parents: List()
19:48:56.395 INFO DAGScheduler - Submitting ResultStage 243 (MapPartitionsRDD[1148] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:56.411 INFO MemoryStore - Block broadcast_489 stored as values in memory (estimated size 426.1 KiB, free 1918.1 MiB)
19:48:56.413 INFO MemoryStore - Block broadcast_489_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.9 MiB)
19:48:56.413 INFO BlockManagerInfo - Added broadcast_489_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.6 MiB)
19:48:56.413 INFO SparkContext - Created broadcast 489 from broadcast at DAGScheduler.scala:1580
19:48:56.413 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 243 (MapPartitionsRDD[1148] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:56.413 INFO TaskSchedulerImpl - Adding task set 243.0 with 1 tasks resource profile 0
19:48:56.414 INFO TaskSetManager - Starting task 0.0 in stage 243.0 (TID 299) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
19:48:56.414 INFO Executor - Running task 0.0 in stage 243.0 (TID 299)
19:48:56.443 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
19:48:56.454 INFO Executor - Finished task 0.0 in stage 243.0 (TID 299). 989 bytes result sent to driver
19:48:56.454 INFO TaskSetManager - Finished task 0.0 in stage 243.0 (TID 299) in 41 ms on localhost (executor driver) (1/1)
19:48:56.454 INFO TaskSchedulerImpl - Removed TaskSet 243.0, whose tasks have all completed, from pool
19:48:56.454 INFO DAGScheduler - ResultStage 243 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.059 s
19:48:56.454 INFO DAGScheduler - Job 183 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:56.454 INFO TaskSchedulerImpl - Killing all running tasks in stage 243: Stage finished
19:48:56.454 INFO DAGScheduler - Job 183 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059711 s
19:48:56.458 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:56.458 INFO DAGScheduler - Got job 184 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:56.458 INFO DAGScheduler - Final stage: ResultStage 244 (count at ReadsSparkSinkUnitTest.java:185)
19:48:56.458 INFO DAGScheduler - Parents of final stage: List()
19:48:56.458 INFO DAGScheduler - Missing parents: List()
19:48:56.458 INFO DAGScheduler - Submitting ResultStage 244 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:56.468 INFO MemoryStore - Block broadcast_490 stored as values in memory (estimated size 148.1 KiB, free 1917.8 MiB)
19:48:56.469 INFO MemoryStore - Block broadcast_490_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.7 MiB)
19:48:56.469 INFO BlockManagerInfo - Added broadcast_490_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:56.469 INFO SparkContext - Created broadcast 490 from broadcast at DAGScheduler.scala:1580
19:48:56.469 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 244 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:56.469 INFO TaskSchedulerImpl - Adding task set 244.0 with 1 tasks resource profile 0
19:48:56.470 INFO TaskSetManager - Starting task 0.0 in stage 244.0 (TID 300) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:56.470 INFO Executor - Running task 0.0 in stage 244.0 (TID 300)
19:48:56.481 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam:0+235514
19:48:56.481 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.482 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam dst=null perm=null proto=rpc
19:48:56.482 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.483 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.483 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_867cbd50-361b-4c2f-8230-141d3c4f59e9.bam.bai dst=null perm=null proto=rpc
19:48:56.485 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
19:48:56.486 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:56.486 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
19:48:56.487 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
19:48:56.487 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:56.489 INFO Executor - Finished task 0.0 in stage 244.0 (TID 300). 989 bytes result sent to driver
19:48:56.489 INFO TaskSetManager - Finished task 0.0 in stage 244.0 (TID 300) in 20 ms on localhost (executor driver) (1/1)
19:48:56.489 INFO TaskSchedulerImpl - Removed TaskSet 244.0, whose tasks have all completed, from pool
19:48:56.489 INFO DAGScheduler - ResultStage 244 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.031 s
19:48:56.489 INFO DAGScheduler - Job 184 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:56.489 INFO TaskSchedulerImpl - Killing all running tasks in stage 244: Stage finished
19:48:56.489 INFO DAGScheduler - Job 184 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.031579 s
19:48:56.497 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:56.498 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.498 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:56.499 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:56.501 INFO MemoryStore - Block broadcast_491 stored as values in memory (estimated size 298.0 KiB, free 1917.4 MiB)
19:48:56.507 INFO MemoryStore - Block broadcast_491_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.4 MiB)
19:48:56.507 INFO BlockManagerInfo - Added broadcast_491_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:56.507 INFO SparkContext - Created broadcast 491 from newAPIHadoopFile at PathSplitSource.java:96
19:48:56.527 INFO MemoryStore - Block broadcast_492 stored as values in memory (estimated size 298.0 KiB, free 1917.1 MiB)
19:48:56.533 INFO MemoryStore - Block broadcast_492_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
19:48:56.533 INFO BlockManagerInfo - Added broadcast_492_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:56.534 INFO SparkContext - Created broadcast 492 from newAPIHadoopFile at PathSplitSource.java:96
19:48:56.553 INFO FileInputFormat - Total input files to process : 1
19:48:56.554 INFO MemoryStore - Block broadcast_493 stored as values in memory (estimated size 19.6 KiB, free 1917.0 MiB)
19:48:56.554 INFO MemoryStore - Block broadcast_493_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.0 MiB)
19:48:56.554 INFO BlockManagerInfo - Added broadcast_493_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.5 MiB)
19:48:56.555 INFO SparkContext - Created broadcast 493 from broadcast at ReadsSparkSink.java:133
19:48:56.555 INFO MemoryStore - Block broadcast_494 stored as values in memory (estimated size 20.0 KiB, free 1917.0 MiB)
19:48:56.556 INFO MemoryStore - Block broadcast_494_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.0 MiB)
19:48:56.556 INFO BlockManagerInfo - Added broadcast_494_piece0 in memory on localhost:36125 (size: 1890.0 B, free: 1919.5 MiB)
19:48:56.556 INFO SparkContext - Created broadcast 494 from broadcast at BamSink.java:76
19:48:56.557 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts dst=null perm=null proto=rpc
19:48:56.558 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:56.558 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:56.558 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:56.559 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:56.564 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:56.565 INFO DAGScheduler - Registering RDD 1180 (mapToPair at SparkUtils.java:161) as input to shuffle 49
19:48:56.565 INFO DAGScheduler - Got job 185 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:56.565 INFO DAGScheduler - Final stage: ResultStage 246 (runJob at SparkHadoopWriter.scala:83)
19:48:56.565 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 245)
19:48:56.565 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 245)
19:48:56.565 INFO DAGScheduler - Submitting ShuffleMapStage 245 (MapPartitionsRDD[1180] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:56.581 INFO MemoryStore - Block broadcast_495 stored as values in memory (estimated size 434.3 KiB, free 1916.6 MiB)
19:48:56.583 INFO MemoryStore - Block broadcast_495_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1916.4 MiB)
19:48:56.583 INFO BlockManagerInfo - Added broadcast_495_piece0 in memory on localhost:36125 (size: 157.6 KiB, free: 1919.3 MiB)
19:48:56.583 INFO SparkContext - Created broadcast 495 from broadcast at DAGScheduler.scala:1580
19:48:56.583 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 245 (MapPartitionsRDD[1180] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:56.583 INFO TaskSchedulerImpl - Adding task set 245.0 with 1 tasks resource profile 0
19:48:56.584 INFO TaskSetManager - Starting task 0.0 in stage 245.0 (TID 301) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
19:48:56.584 INFO Executor - Running task 0.0 in stage 245.0 (TID 301)
19:48:56.613 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:56.624 INFO Executor - Finished task 0.0 in stage 245.0 (TID 301). 1148 bytes result sent to driver
19:48:56.625 INFO TaskSetManager - Finished task 0.0 in stage 245.0 (TID 301) in 42 ms on localhost (executor driver) (1/1)
19:48:56.625 INFO TaskSchedulerImpl - Removed TaskSet 245.0, whose tasks have all completed, from pool
19:48:56.625 INFO DAGScheduler - ShuffleMapStage 245 (mapToPair at SparkUtils.java:161) finished in 0.060 s
19:48:56.625 INFO DAGScheduler - looking for newly runnable stages
19:48:56.625 INFO DAGScheduler - running: HashSet()
19:48:56.625 INFO DAGScheduler - waiting: HashSet(ResultStage 246)
19:48:56.625 INFO DAGScheduler - failed: HashSet()
19:48:56.625 INFO DAGScheduler - Submitting ResultStage 246 (MapPartitionsRDD[1185] at mapToPair at BamSink.java:91), which has no missing parents
19:48:56.631 INFO MemoryStore - Block broadcast_496 stored as values in memory (estimated size 155.4 KiB, free 1916.3 MiB)
19:48:56.632 INFO MemoryStore - Block broadcast_496_piece0 stored as bytes in memory (estimated size 58.5 KiB, free 1916.2 MiB)
19:48:56.632 INFO BlockManagerInfo - Added broadcast_496_piece0 in memory on localhost:36125 (size: 58.5 KiB, free: 1919.2 MiB)
19:48:56.632 INFO SparkContext - Created broadcast 496 from broadcast at DAGScheduler.scala:1580
19:48:56.632 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 246 (MapPartitionsRDD[1185] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:48:56.632 INFO TaskSchedulerImpl - Adding task set 246.0 with 1 tasks resource profile 0
19:48:56.633 INFO TaskSetManager - Starting task 0.0 in stage 246.0 (TID 302) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:56.633 INFO Executor - Running task 0.0 in stage 246.0 (TID 302)
19:48:56.636 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:56.636 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:56.647 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:56.647 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:56.647 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:56.647 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:56.647 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:56.647 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:56.648 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.649 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.650 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:56.651 INFO StateChange - BLOCK* allocate blk_1073741905_1081, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/part-r-00000
19:48:56.652 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741905_1081 src: /127.0.0.1:54766 dest: /127.0.0.1:45925
19:48:56.654 INFO clienttrace - src: /127.0.0.1:54766, dest: /127.0.0.1:45925, bytes: 235299, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741905_1081, duration(ns): 1048184
19:48:56.654 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741905_1081, type=LAST_IN_PIPELINE terminating
19:48:56.655 INFO FSNamesystem - BLOCK* blk_1073741905_1081 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/part-r-00000
19:48:57.055 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.056 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
19:48:57.056 INFO StateChange - BLOCK* allocate blk_1073741906_1082, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/.part-r-00000.sbi
19:48:57.057 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741906_1082 src: /127.0.0.1:54780 dest: /127.0.0.1:45925
19:48:57.058 INFO clienttrace - src: /127.0.0.1:54780, dest: /127.0.0.1:45925, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741906_1082, duration(ns): 395173
19:48:57.058 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741906_1082, type=LAST_IN_PIPELINE terminating
19:48:57.059 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.060 INFO StateChange - BLOCK* allocate blk_1073741907_1083, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/.part-r-00000.bai
19:48:57.060 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741907_1083 src: /127.0.0.1:54782 dest: /127.0.0.1:45925
19:48:57.061 INFO clienttrace - src: /127.0.0.1:54782, dest: /127.0.0.1:45925, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741907_1083, duration(ns): 428132
19:48:57.061 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741907_1083, type=LAST_IN_PIPELINE terminating
19:48:57.062 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.064 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0 dst=null perm=null proto=rpc
19:48:57.065 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0 dst=null perm=null proto=rpc
19:48:57.065 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/task_202507151948564495261580706126691_1185_r_000000 dst=null perm=null proto=rpc
19:48:57.065 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/_temporary/attempt_202507151948564495261580706126691_1185_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/task_202507151948564495261580706126691_1185_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:57.066 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948564495261580706126691_1185_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/task_202507151948564495261580706126691_1185_r_000000
19:48:57.066 INFO SparkHadoopMapRedUtil - attempt_202507151948564495261580706126691_1185_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:57.066 INFO Executor - Finished task 0.0 in stage 246.0 (TID 302). 1858 bytes result sent to driver
19:48:57.066 INFO TaskSetManager - Finished task 0.0 in stage 246.0 (TID 302) in 433 ms on localhost (executor driver) (1/1)
19:48:57.066 INFO TaskSchedulerImpl - Removed TaskSet 246.0, whose tasks have all completed, from pool
19:48:57.067 INFO DAGScheduler - ResultStage 246 (runJob at SparkHadoopWriter.scala:83) finished in 0.441 s
19:48:57.067 INFO DAGScheduler - Job 185 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:57.067 INFO TaskSchedulerImpl - Killing all running tasks in stage 246: Stage finished
19:48:57.067 INFO DAGScheduler - Job 185 finished: runJob at SparkHadoopWriter.scala:83, took 0.502366 s
19:48:57.067 INFO SparkHadoopWriter - Start to commit write Job job_202507151948564495261580706126691_1185.
19:48:57.067 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:57.068 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts dst=null perm=null proto=rpc
19:48:57.068 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/task_202507151948564495261580706126691_1185_r_000000 dst=null perm=null proto=rpc
19:48:57.068 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:57.069 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/task_202507151948564495261580706126691_1185_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.069 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:57.070 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/task_202507151948564495261580706126691_1185_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.070 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:57.070 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary/0/task_202507151948564495261580706126691_1185_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.071 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_temporary dst=null perm=null proto=rpc
19:48:57.071 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.072 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.072 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.spark-staging-1185 dst=null perm=null proto=rpc
19:48:57.073 INFO SparkHadoopWriter - Write Job job_202507151948564495261580706126691_1185 committed. Elapsed time: 5 ms.
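[annotation] The audit trail from 19:48:57.065 onward is the v1 commit protocol in action: task commit renames the attempt directory to a task directory, then job commit renames each part into the final .parts directory, deletes _temporary, and writes _SUCCESS. A hedged replay of those renames with the plain FileSystem API; CommitSequenceSketch, attemptId and taskId are illustrative names only, the real work is done by FileOutputCommitter and SparkHadoopWriter, not by code like this:

    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public final class CommitSequenceSketch {
        // Two-step v1 commit as seen in the audit log: attempt dir -> task dir (task commit),
        // then task files -> final .parts dir, drop _temporary, write _SUCCESS (job commit).
        public static void commit(FileSystem fs, Path partsDir, String attemptId, String taskId) throws Exception {
            Path attemptDir = new Path(partsDir, "_temporary/0/_temporary/" + attemptId);
            Path taskDir = new Path(partsDir, "_temporary/0/" + taskId);
            fs.rename(attemptDir, taskDir);                                               // task commit
            for (FileStatus part : fs.listStatus(taskDir)) {
                fs.rename(part.getPath(), new Path(partsDir, part.getPath().getName()));  // promote each part
            }
            fs.delete(new Path(partsDir, "_temporary"), true);                            // remove staging tree
            fs.create(new Path(partsDir, "_SUCCESS")).close();                            // success marker
        }
    }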
19:48:57.073 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.074 INFO StateChange - BLOCK* allocate blk_1073741908_1084, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/header
19:48:57.075 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741908_1084 src: /127.0.0.1:54796 dest: /127.0.0.1:45925
19:48:57.076 INFO clienttrace - src: /127.0.0.1:54796, dest: /127.0.0.1:45925, bytes: 1190, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741908_1084, duration(ns): 361936
19:48:57.076 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741908_1084, type=LAST_IN_PIPELINE terminating
19:48:57.076 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.077 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.078 INFO StateChange - BLOCK* allocate blk_1073741909_1085, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/terminator
19:48:57.078 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741909_1085 src: /127.0.0.1:54800 dest: /127.0.0.1:45925
19:48:57.079 INFO clienttrace - src: /127.0.0.1:54800, dest: /127.0.0.1:45925, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741909_1085, duration(ns): 348159
19:48:57.079 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741909_1085, type=LAST_IN_PIPELINE terminating
19:48:57.080 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.080 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts dst=null perm=null proto=rpc
19:48:57.081 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.082 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.082 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam
19:48:57.083 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.083 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.083 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.084 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam done
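[annotation] The "Concatenating 3 parts" step stitches the SAM header, the sorted reads part, and the BGZF terminator into one BAM. On HDFS this can use the filesystem-native concat call followed by a rename over the final name, exactly the create/concat/delete/rename sequence in the audit lines above. A hedged sketch (ConcatSketch is an illustrative name; the real logic lives in the Disq HadoopFileSystemWrapper that GATK uses and falls back to byte copying on filesystems without concat support):

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public final class ConcatSketch {
        // Mirrors the audit sequence: create an empty target, concat header + reads + terminator
        // into it with FileSystem.concat, then rename it over the final BAM name.
        public static void concatParts(FileSystem fs, Path partsDir, Path finalBam) throws Exception {
            Path output = new Path(partsDir, "output");
            fs.create(output).close();                                   // concat target (empty, as in the log)
            fs.concat(output, new Path[] {
                    new Path(partsDir, "header"),
                    new Path(partsDir, "part-r-00000"),
                    new Path(partsDir, "terminator")});
            fs.delete(finalBam, false);                                  // drop any stale output
            fs.rename(output, finalBam);                                 // expose the merged BAM
        }
    }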
19:48:57.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.084 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi
19:48:57.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts dst=null perm=null proto=rpc
19:48:57.085 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:57.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:57.087 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
19:48:57.087 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
19:48:57.088 INFO StateChange - BLOCK* allocate blk_1073741910_1086, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi
19:48:57.089 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741910_1086 src: /127.0.0.1:54808 dest: /127.0.0.1:45925
19:48:57.090 INFO clienttrace - src: /127.0.0.1:54808, dest: /127.0.0.1:45925, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741910_1086, duration(ns): 355606
19:48:57.090 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741910_1086, type=LAST_IN_PIPELINE terminating
19:48:57.090 INFO FSNamesystem - BLOCK* blk_1073741910_1086 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi
19:48:57.491 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.491 INFO IndexFileMerger - Done merging .sbi files
19:48:57.491 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/ to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai
19:48:57.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts dst=null perm=null proto=rpc
19:48:57.492 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:57.493 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:57.493 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:57.494 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
19:48:57.495 INFO StateChange - BLOCK* allocate blk_1073741911_1087, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai
19:48:57.495 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741911_1087 src: /127.0.0.1:47090 dest: /127.0.0.1:45925
19:48:57.496 INFO clienttrace - src: /127.0.0.1:47090, dest: /127.0.0.1:45925, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741911_1087, duration(ns): 390883
19:48:57.496 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741911_1087, type=LAST_IN_PIPELINE terminating
19:48:57.497 INFO FSNamesystem - BLOCK* blk_1073741911_1087 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai
19:48:57.897 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:57.897 INFO IndexFileMerger - Done merging .bai files
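[annotation] The IndexFileMerger pass gathers the hidden .part-r-*.sbi and .part-r-*.bai files that each task wrote next to its part, merges them into a single index beside the final BAM, and the parts directory is then deleted. A hedged sketch of the discovery step only (IndexPartsSketch is an illustrative name; the actual merge, in the Disq library, has to rewrite virtual file offsets so the per-part indexes line up with the concatenated BAM, which is deliberately not reproduced here):

    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public final class IndexPartsSketch {
        // Finds the hidden per-task BAI fragments, e.g. .part-r-00000.bai, written alongside each part.
        public static FileStatus[] findBaiParts(FileSystem fs, Path partsDir) throws Exception {
            return fs.globStatus(new Path(partsDir, ".part-r-*.bai"));
        }
    }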
19:48:57.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.parts dst=null perm=null proto=rpc
19:48:57.906 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:57.914 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi dst=null perm=null proto=rpc
19:48:57.914 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi dst=null perm=null proto=rpc
19:48:57.914 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi dst=null perm=null proto=rpc
19:48:57.915 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
19:48:57.915 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.916 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.916 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.916 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.917 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:57.917 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:57.917 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:57.919 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
19:48:57.919 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:57.920 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:57.920 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
19:48:57.920 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi dst=null perm=null proto=rpc
19:48:57.921 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi dst=null perm=null proto=rpc
19:48:57.921 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.sbi dst=null perm=null proto=rpc
19:48:57.922 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
19:48:57.922 INFO MemoryStore - Block broadcast_497 stored as values in memory (estimated size 312.0 B, free 1916.2 MiB)
19:48:57.926 INFO MemoryStore - Block broadcast_497_piece0 stored as bytes in memory (estimated size 231.0 B, free 1916.2 MiB)
19:48:57.926 INFO BlockManagerInfo - Added broadcast_497_piece0 in memory on localhost:36125 (size: 231.0 B, free: 1919.2 MiB)
19:48:57.926 INFO BlockManagerInfo - Removed broadcast_492_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:48:57.926 INFO SparkContext - Created broadcast 497 from broadcast at BamSource.java:104
19:48:57.927 INFO BlockManagerInfo - Removed broadcast_490_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.4 MiB)
19:48:57.927 INFO BlockManagerInfo - Removed broadcast_495_piece0 on localhost:36125 in memory (size: 157.6 KiB, free: 1919.5 MiB)
19:48:57.928 INFO MemoryStore - Block broadcast_498 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
19:48:57.928 INFO BlockManagerInfo - Removed broadcast_489_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.7 MiB)
19:48:57.928 INFO BlockManagerInfo - Removed broadcast_488_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.7 MiB)
19:48:57.929 INFO BlockManagerInfo - Removed broadcast_494_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.7 MiB)
19:48:57.929 INFO BlockManagerInfo - Removed broadcast_493_piece0 on localhost:36125 in memory (size: 1890.0 B, free: 1919.7 MiB)
19:48:57.929 INFO BlockManagerInfo - Removed broadcast_482_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.7 MiB)
19:48:57.930 INFO BlockManagerInfo - Removed broadcast_485_piece0 on localhost:36125 in memory (size: 67.1 KiB, free: 1919.8 MiB)
19:48:57.930 INFO BlockManagerInfo - Removed broadcast_486_piece0 on localhost:36125 in memory (size: 233.0 B, free: 1919.8 MiB)
19:48:57.930 INFO BlockManagerInfo - Removed broadcast_496_piece0 on localhost:36125 in memory (size: 58.5 KiB, free: 1919.8 MiB)
19:48:57.931 INFO BlockManagerInfo - Removed broadcast_480_piece0 on localhost:36125 in memory (size: 50.3 KiB, free: 1919.9 MiB)
19:48:57.931 INFO BlockManagerInfo - Removed broadcast_487_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.9 MiB)
19:48:57.931 INFO BlockManagerInfo - Removed broadcast_483_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1920.0 MiB)
19:48:57.938 INFO MemoryStore - Block broadcast_498_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
19:48:57.938 INFO BlockManagerInfo - Added broadcast_498_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.9 MiB)
19:48:57.938 INFO SparkContext - Created broadcast 498 from newAPIHadoopFile at PathSplitSource.java:96
19:48:57.947 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.947 INFO FileInputFormat - Total input files to process : 1
19:48:57.948 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.962 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:57.962 INFO DAGScheduler - Got job 186 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:57.962 INFO DAGScheduler - Final stage: ResultStage 247 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:57.962 INFO DAGScheduler - Parents of final stage: List()
19:48:57.962 INFO DAGScheduler - Missing parents: List()
19:48:57.963 INFO DAGScheduler - Submitting ResultStage 247 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:57.969 INFO MemoryStore - Block broadcast_499 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
19:48:57.970 INFO MemoryStore - Block broadcast_499_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
19:48:57.970 INFO BlockManagerInfo - Added broadcast_499_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.8 MiB)
19:48:57.970 INFO SparkContext - Created broadcast 499 from broadcast at DAGScheduler.scala:1580
19:48:57.970 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 247 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:57.970 INFO TaskSchedulerImpl - Adding task set 247.0 with 1 tasks resource profile 0
19:48:57.971 INFO TaskSetManager - Starting task 0.0 in stage 247.0 (TID 303) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:57.971 INFO Executor - Running task 0.0 in stage 247.0 (TID 303)
19:48:57.987 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam:0+236517
19:48:57.988 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.988 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:57.989 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:57.989 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:57.989 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:57.991 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
19:48:57.992 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:57.992 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:57.993 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
19:48:57.994 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:57.996 INFO Executor - Finished task 0.0 in stage 247.0 (TID 303). 749470 bytes result sent to driver
19:48:57.997 INFO TaskSetManager - Finished task 0.0 in stage 247.0 (TID 303) in 26 ms on localhost (executor driver) (1/1)
19:48:57.997 INFO TaskSchedulerImpl - Removed TaskSet 247.0, whose tasks have all completed, from pool
19:48:57.997 INFO DAGScheduler - ResultStage 247 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.034 s
19:48:57.997 INFO DAGScheduler - Job 186 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:57.997 INFO TaskSchedulerImpl - Killing all running tasks in stage 247: Stage finished
19:48:57.997 INFO DAGScheduler - Job 186 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.035290 s
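[annotation] Jobs 186 through 188 are the unit test's verification pass: it collects the reads back from the merged BAM on the MiniDFSCluster and compares them, and their count, against the original local input. A minimal sketch of that assertion pattern under assumed names (originalReads and roundTripReads are hypothetical JavaRDD handles; the real checks live in ReadsSparkSinkUnitTest.java around lines 182 and 185):

    import org.apache.spark.api.java.JavaRDD;
    import org.broadinstitute.hellbender.utils.read.GATKRead;
    import org.testng.Assert;

    public final class RoundTripCheckSketch {
        public static void assertSameSize(JavaRDD<GATKRead> originalReads, JavaRDD<GATKRead> roundTripReads) {
            // Materialize the round-tripped reads (Job 186: collect) ...
            java.util.List<GATKRead> actual = roundTripReads.collect();
            // ... and compare sizes against the source (Jobs 187/188: count).
            Assert.assertEquals((long) actual.size(), originalReads.count());
        }
    }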
19:48:58.009 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:58.010 INFO DAGScheduler - Got job 187 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:58.010 INFO DAGScheduler - Final stage: ResultStage 248 (count at ReadsSparkSinkUnitTest.java:185)
19:48:58.010 INFO DAGScheduler - Parents of final stage: List()
19:48:58.010 INFO DAGScheduler - Missing parents: List()
19:48:58.010 INFO DAGScheduler - Submitting ResultStage 248 (MapPartitionsRDD[1173] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:58.026 INFO MemoryStore - Block broadcast_500 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
19:48:58.027 INFO MemoryStore - Block broadcast_500_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
19:48:58.027 INFO BlockManagerInfo - Added broadcast_500_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.7 MiB)
19:48:58.028 INFO SparkContext - Created broadcast 500 from broadcast at DAGScheduler.scala:1580
19:48:58.028 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 248 (MapPartitionsRDD[1173] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:58.028 INFO TaskSchedulerImpl - Adding task set 248.0 with 1 tasks resource profile 0
19:48:58.028 INFO TaskSetManager - Starting task 0.0 in stage 248.0 (TID 304) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
19:48:58.029 INFO Executor - Running task 0.0 in stage 248.0 (TID 304)
19:48:58.057 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
19:48:58.064 INFO Executor - Finished task 0.0 in stage 248.0 (TID 304). 989 bytes result sent to driver
19:48:58.064 INFO TaskSetManager - Finished task 0.0 in stage 248.0 (TID 304) in 36 ms on localhost (executor driver) (1/1)
19:48:58.064 INFO TaskSchedulerImpl - Removed TaskSet 248.0, whose tasks have all completed, from pool
19:48:58.064 INFO DAGScheduler - ResultStage 248 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.054 s
19:48:58.064 INFO DAGScheduler - Job 187 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:58.064 INFO TaskSchedulerImpl - Killing all running tasks in stage 248: Stage finished
19:48:58.064 INFO DAGScheduler - Job 187 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.054974 s
19:48:58.068 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:58.068 INFO DAGScheduler - Got job 188 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:58.068 INFO DAGScheduler - Final stage: ResultStage 249 (count at ReadsSparkSinkUnitTest.java:185)
19:48:58.068 INFO DAGScheduler - Parents of final stage: List()
19:48:58.068 INFO DAGScheduler - Missing parents: List()
19:48:58.068 INFO DAGScheduler - Submitting ResultStage 249 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:58.074 INFO MemoryStore - Block broadcast_501 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
19:48:58.074 INFO MemoryStore - Block broadcast_501_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
19:48:58.075 INFO BlockManagerInfo - Added broadcast_501_piece0 in memory on localhost:36125 (size: 54.6 KiB, free: 1919.6 MiB)
19:48:58.075 INFO SparkContext - Created broadcast 501 from broadcast at DAGScheduler.scala:1580
19:48:58.075 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 249 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:58.075 INFO TaskSchedulerImpl - Adding task set 249.0 with 1 tasks resource profile 0
19:48:58.075 INFO TaskSetManager - Starting task 0.0 in stage 249.0 (TID 305) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:48:58.076 INFO Executor - Running task 0.0 in stage 249.0 (TID 305)
19:48:58.086 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam:0+236517
19:48:58.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:58.087 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam dst=null perm=null proto=rpc
19:48:58.087 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:58.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:58.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_dcafeb20-3403-4809-ac29-e73ca8f676f9.bam.bai dst=null perm=null proto=rpc
19:48:58.089 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
19:48:58.090 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:58.090 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
19:48:58.092 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
19:48:58.093 INFO Executor - Finished task 0.0 in stage 249.0 (TID 305). 989 bytes result sent to driver
19:48:58.093 INFO TaskSetManager - Finished task 0.0 in stage 249.0 (TID 305) in 18 ms on localhost (executor driver) (1/1)
19:48:58.093 INFO TaskSchedulerImpl - Removed TaskSet 249.0, whose tasks have all completed, from pool
19:48:58.093 INFO DAGScheduler - ResultStage 249 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.025 s
19:48:58.093 INFO DAGScheduler - Job 188 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:58.093 INFO TaskSchedulerImpl - Killing all running tasks in stage 249: Stage finished
19:48:58.093 INFO DAGScheduler - Job 188 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.025788 s
19:48:58.101 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:58.102 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:58.102 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:58.103 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:58.104 INFO MemoryStore - Block broadcast_502 stored as values in memory (estimated size 576.0 B, free 1918.4 MiB)
19:48:58.105 INFO MemoryStore - Block broadcast_502_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.4 MiB)
19:48:58.105 INFO BlockManagerInfo - Added broadcast_502_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.6 MiB)
19:48:58.105 INFO SparkContext - Created broadcast 502 from broadcast at CramSource.java:114
19:48:58.106 INFO MemoryStore - Block broadcast_503 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:48:58.112 INFO MemoryStore - Block broadcast_503_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:48:58.112 INFO BlockManagerInfo - Added broadcast_503_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.6 MiB)
19:48:58.112 INFO SparkContext - Created broadcast 503 from newAPIHadoopFile at PathSplitSource.java:96
19:48:58.127 INFO MemoryStore - Block broadcast_504 stored as values in memory (estimated size 576.0 B, free 1918.0 MiB)
19:48:58.127 INFO MemoryStore - Block broadcast_504_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.0 MiB)
19:48:58.127 INFO BlockManagerInfo - Added broadcast_504_piece0 in memory on localhost:36125 (size: 228.0 B, free: 1919.6 MiB)
19:48:58.127 INFO SparkContext - Created broadcast 504 from broadcast at CramSource.java:114
19:48:58.128 INFO MemoryStore - Block broadcast_505 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
19:48:58.134 INFO MemoryStore - Block broadcast_505_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
19:48:58.134 INFO BlockManagerInfo - Added broadcast_505_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.5 MiB)
19:48:58.134 INFO SparkContext - Created broadcast 505 from newAPIHadoopFile at PathSplitSource.java:96
19:48:58.148 INFO FileInputFormat - Total input files to process : 1
19:48:58.149 INFO MemoryStore - Block broadcast_506 stored as values in memory (estimated size 6.0 KiB, free 1917.7 MiB)
19:48:58.149 INFO MemoryStore - Block broadcast_506_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
19:48:58.150 INFO BlockManagerInfo - Added broadcast_506_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.5 MiB)
19:48:58.150 INFO SparkContext - Created broadcast 506 from broadcast at ReadsSparkSink.java:133
19:48:58.150 INFO MemoryStore - Block broadcast_507 stored as values in memory (estimated size 6.2 KiB, free 1917.7 MiB)
19:48:58.151 INFO MemoryStore - Block broadcast_507_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
19:48:58.151 INFO BlockManagerInfo - Added broadcast_507_piece0 in memory on localhost:36125 (size: 1473.0 B, free: 1919.5 MiB)
19:48:58.151 INFO SparkContext - Created broadcast 507 from broadcast at CramSink.java:76
19:48:58.153 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts dst=null perm=null proto=rpc
19:48:58.153 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:58.153 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:58.153 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:58.154 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:58.160 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:58.160 INFO DAGScheduler - Registering RDD 1203 (mapToPair at SparkUtils.java:161) as input to shuffle 50
19:48:58.160 INFO DAGScheduler - Got job 189 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:58.160 INFO DAGScheduler - Final stage: ResultStage 251 (runJob at SparkHadoopWriter.scala:83)
19:48:58.160 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 250)
19:48:58.161 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 250)
19:48:58.161 INFO DAGScheduler - Submitting ShuffleMapStage 250 (MapPartitionsRDD[1203] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:58.172 INFO MemoryStore - Block broadcast_508 stored as values in memory (estimated size 292.8 KiB, free 1917.4 MiB)
19:48:58.173 INFO MemoryStore - Block broadcast_508_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.3 MiB)
19:48:58.173 INFO BlockManagerInfo - Added broadcast_508_piece0 in memory on localhost:36125 (size: 107.3 KiB, free: 1919.4 MiB)
19:48:58.173 INFO SparkContext - Created broadcast 508 from broadcast at DAGScheduler.scala:1580
19:48:58.173 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 250 (MapPartitionsRDD[1203] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:58.173 INFO TaskSchedulerImpl - Adding task set 250.0 with 1 tasks resource profile 0
19:48:58.174 INFO TaskSetManager - Starting task 0.0 in stage 250.0 (TID 306) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
19:48:58.174 INFO Executor - Running task 0.0 in stage 250.0 (TID 306)
19:48:58.194 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:58.208 INFO Executor - Finished task 0.0 in stage 250.0 (TID 306). 1234 bytes result sent to driver
19:48:58.209 INFO BlockManagerInfo - Removed broadcast_498_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:58.209 INFO TaskSetManager - Finished task 0.0 in stage 250.0 (TID 306) in 35 ms on localhost (executor driver) (1/1)
19:48:58.209 INFO TaskSchedulerImpl - Removed TaskSet 250.0, whose tasks have all completed, from pool
19:48:58.209 INFO DAGScheduler - ShuffleMapStage 250 (mapToPair at SparkUtils.java:161) finished in 0.048 s
19:48:58.209 INFO DAGScheduler - looking for newly runnable stages
19:48:58.209 INFO DAGScheduler - running: HashSet()
19:48:58.209 INFO DAGScheduler - waiting: HashSet(ResultStage 251)
19:48:58.209 INFO DAGScheduler - failed: HashSet()
19:48:58.209 INFO DAGScheduler - Submitting ResultStage 251 (MapPartitionsRDD[1208] at mapToPair at CramSink.java:89), which has no missing parents
19:48:58.209 INFO BlockManagerInfo - Removed broadcast_505_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:58.210 INFO BlockManagerInfo - Removed broadcast_491_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.6 MiB)
19:48:58.210 INFO BlockManagerInfo - Removed broadcast_497_piece0 on localhost:36125 in memory (size: 231.0 B, free: 1919.6 MiB)
19:48:58.210 INFO BlockManagerInfo - Removed broadcast_501_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.6 MiB)
19:48:58.211 INFO BlockManagerInfo - Removed broadcast_504_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.6 MiB)
19:48:58.211 INFO BlockManagerInfo - Removed broadcast_500_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.8 MiB)
19:48:58.211 INFO BlockManagerInfo - Removed broadcast_499_piece0 on localhost:36125 in memory (size: 54.6 KiB, free: 1919.8 MiB)
19:48:58.216 INFO MemoryStore - Block broadcast_509 stored as values in memory (estimated size 153.3 KiB, free 1919.1 MiB)
19:48:58.217 INFO MemoryStore - Block broadcast_509_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1919.0 MiB)
19:48:58.217 INFO BlockManagerInfo - Added broadcast_509_piece0 in memory on localhost:36125 (size: 58.1 KiB, free: 1919.8 MiB)
19:48:58.217 INFO SparkContext - Created broadcast 509 from broadcast at DAGScheduler.scala:1580
19:48:58.217 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 251 (MapPartitionsRDD[1208] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
19:48:58.217 INFO TaskSchedulerImpl - Adding task set 251.0 with 1 tasks resource profile 0
19:48:58.218 INFO TaskSetManager - Starting task 0.0 in stage 251.0 (TID 307) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:58.218 INFO Executor - Running task 0.0 in stage 251.0 (TID 307)
19:48:58.224 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:58.225 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:58.234 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:58.234 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:58.234 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:58.234 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:48:58.234 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:58.234 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:58.235 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/_temporary/attempt_202507151948586347444616967096553_1208_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:58.263 INFO StateChange - BLOCK* allocate blk_1073741912_1088, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/_temporary/attempt_202507151948586347444616967096553_1208_r_000000_0/part-r-00000
19:48:58.265 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741912_1088 src: /127.0.0.1:47100 dest: /127.0.0.1:45925
19:48:58.266 INFO clienttrace - src: /127.0.0.1:47100, dest: /127.0.0.1:45925, bytes: 42659, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741912_1088, duration(ns): 480761
19:48:58.266 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741912_1088, type=LAST_IN_PIPELINE terminating
19:48:58.267 INFO FSNamesystem - BLOCK* blk_1073741912_1088 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/_temporary/attempt_202507151948586347444616967096553_1208_r_000000_0/part-r-00000
19:48:58.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741906_1082 replica FinalizedReplica, blk_1073741906_1082, FINALIZED
  getNumBytes()     = 204
  getBytesOnDisk()  = 204
  getVisibleLength()= 204
  getVolume()       = /tmp/minicluster_storage10689261495343833868/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741906 for deletion
19:48:58.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741907_1083 replica FinalizedReplica, blk_1073741907_1083, FINALIZED
  getNumBytes()     = 592
  getBytesOnDisk()  = 592
  getVisibleLength()= 592
  getVolume()       = /tmp/minicluster_storage10689261495343833868/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741907 for deletion
19:48:58.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741899_1075 replica FinalizedReplica, blk_1073741899_1075, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage10689261495343833868/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741899 for deletion
19:48:58.414 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741906_1082 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741906
19:48:58.414 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741900_1076 replica FinalizedReplica, blk_1073741900_1076, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage10689261495343833868/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741900 for deletion
19:48:58.414 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741907_1083 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741907
19:48:58.414 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741900_1076 URI file:/tmp/minicluster_storage10689261495343833868/data/data2/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741900
19:48:58.415 INFO FsDatasetAsyncDiskService - Deleted BP-1160364076-10.1.0.127-1752608895182 blk_1073741899_1075 URI file:/tmp/minicluster_storage10689261495343833868/data/data1/current/BP-1160364076-10.1.0.127-1752608895182/current/finalized/subdir0/subdir0/blk_1073741899
19:48:58.667 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/_temporary/attempt_202507151948586347444616967096553_1208_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:58.668 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/_temporary/attempt_202507151948586347444616967096553_1208_r_000000_0 dst=null perm=null proto=rpc
19:48:58.669 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/_temporary/attempt_202507151948586347444616967096553_1208_r_000000_0 dst=null perm=null proto=rpc
19:48:58.669 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/task_202507151948586347444616967096553_1208_r_000000 dst=null perm=null proto=rpc
19:48:58.669 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/_temporary/attempt_202507151948586347444616967096553_1208_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/task_202507151948586347444616967096553_1208_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:58.670 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948586347444616967096553_1208_r_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/task_202507151948586347444616967096553_1208_r_000000
19:48:58.670 INFO SparkHadoopMapRedUtil - attempt_202507151948586347444616967096553_1208_r_000000_0: Committed. Elapsed time: 1 ms.
19:48:58.670 INFO Executor - Finished task 0.0 in stage 251.0 (TID 307). 1858 bytes result sent to driver
19:48:58.670 INFO TaskSetManager - Finished task 0.0 in stage 251.0 (TID 307) in 452 ms on localhost (executor driver) (1/1)
19:48:58.670 INFO TaskSchedulerImpl - Removed TaskSet 251.0, whose tasks have all completed, from pool
19:48:58.671 INFO DAGScheduler - ResultStage 251 (runJob at SparkHadoopWriter.scala:83) finished in 0.462 s
19:48:58.671 INFO DAGScheduler - Job 189 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:58.671 INFO TaskSchedulerImpl - Killing all running tasks in stage 251: Stage finished
19:48:58.671 INFO DAGScheduler - Job 189 finished: runJob at SparkHadoopWriter.scala:83, took 0.510920 s
19:48:58.671 INFO SparkHadoopWriter - Start to commit write Job job_202507151948586347444616967096553_1208.
19:48:58.671 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0 dst=null perm=null proto=rpc
19:48:58.672 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts dst=null perm=null proto=rpc
19:48:58.672 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/task_202507151948586347444616967096553_1208_r_000000 dst=null perm=null proto=rpc
19:48:58.673 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/part-r-00000 dst=null perm=null proto=rpc
19:48:58.673 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary/0/task_202507151948586347444616967096553_1208_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:58.674 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_temporary dst=null perm=null proto=rpc
19:48:58.674 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:58.675 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:58.676 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/.spark-staging-1208 dst=null perm=null proto=rpc
19:48:58.676 INFO SparkHadoopWriter - Write Job job_202507151948586347444616967096553_1208 committed. Elapsed time: 4 ms.
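The lines above trace the FileOutputCommitter algorithm-version-1 protocol end to end: the task attempt directory is renamed to the task directory at task commit, then at job commit the part file is renamed into the output directory, _temporary is deleted and a zero-length _SUCCESS marker is created. A minimal Java sketch of that two-rename protocol against a plain Hadoop FileSystem, with illustrative paths (not the actual Spark/Hadoop committer code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CommitV1Sketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        // Illustrative layout mirroring the log; the real paths come from the committer.
        Path out     = new Path("/user/runner/example.parts");
        Path attempt = new Path(out, "_temporary/0/_temporary/attempt_example_r_000000_0");
        Path taskDir = new Path(out, "_temporary/0/task_example_r_000000");

        // 1) Task commit: promote the attempt directory to the task directory.
        fs.rename(attempt, taskDir);

        // 2) Job commit: move each committed part file up into the output directory.
        for (FileStatus part : fs.listStatus(taskDir)) {
            fs.rename(part.getPath(), new Path(out, part.getPath().getName()));
        }

        // 3) Drop the scratch space and leave a zero-length success marker.
        fs.delete(new Path(out, "_temporary"), true);
        fs.create(new Path(out, "_SUCCESS")).close();
    }
}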
19:48:58.676 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:58.678 INFO StateChange - BLOCK* allocate blk_1073741913_1089, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/header
19:48:58.679 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741913_1089 src: /127.0.0.1:47114 dest: /127.0.0.1:45925
19:48:58.679 INFO clienttrace - src: /127.0.0.1:47114, dest: /127.0.0.1:45925, bytes: 1016, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741913_1089, duration(ns): 361718
19:48:58.680 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741913_1089, type=LAST_IN_PIPELINE terminating
19:48:58.680 INFO FSNamesystem - BLOCK* blk_1073741913_1089 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/header
19:48:59.081 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:59.081 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:59.082 INFO StateChange - BLOCK* allocate blk_1073741914_1090, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/terminator
19:48:59.083 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741914_1090 src: /127.0.0.1:47124 dest: /127.0.0.1:45925
19:48:59.084 INFO clienttrace - src: /127.0.0.1:47124, dest: /127.0.0.1:45925, bytes: 38, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741914_1090, duration(ns): 316406
19:48:59.084 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741914_1090, type=LAST_IN_PIPELINE terminating
19:48:59.084 INFO FSNamesystem - BLOCK* blk_1073741914_1090 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/terminator
19:48:59.485 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/terminator is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:59.485 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts dst=null perm=null proto=rpc
19:48:59.486 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:59.486 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:59.486 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram
19:48:59.487 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/header, /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:59.487 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.488 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.488 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts/output dst=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:59.488 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram done
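The header, part-r-00000 and terminator pieces are then stitched into a single CRAM via HDFS concat rather than a byte-level copy. A hedged sketch of that step using the standard FileSystem.concat API; the paths are illustrative, and the target file must already exist, which is why the log creates an empty "output" file first:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ConcatSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        Path parts     = new Path("/user/runner/example.cram.parts");
        Path target    = new Path(parts, "output");           // empty target created first
        Path finalCram = new Path("/user/runner/example.cram");

        // concat() moves the source blocks into the target without rewriting data;
        // the base FileSystem class may not support it, but DistributedFileSystem does.
        fs.create(target).close();
        fs.concat(target, new Path[] {
                new Path(parts, "header"),
                new Path(parts, "part-r-00000"),
                new Path(parts, "terminator")
        });

        // Promote the concatenated file to its final name and drop the scratch directory.
        fs.delete(finalCram, false);
        fs.rename(target, finalCram);
        fs.delete(parts, true);
    }
}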
19:48:59.489 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.parts dst=null perm=null proto=rpc
19:48:59.489 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.489 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.490 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.490 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.crai dst=null perm=null proto=rpc
19:48:59.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.crai dst=null perm=null proto=rpc
19:48:59.493 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
19:48:59.493 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
19:48:59.493 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
19:48:59.494 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.494 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.494 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.crai dst=null perm=null proto=rpc
19:48:59.495 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.crai dst=null perm=null proto=rpc
19:48:59.495 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.495 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.496 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
19:48:59.497 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
19:48:59.497 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
19:48:59.497 INFO MemoryStore - Block broadcast_510 stored as values in memory (estimated size 528.0 B, free 1919.0 MiB)
19:48:59.498 INFO MemoryStore - Block broadcast_510_piece0 stored as bytes in memory (estimated size 187.0 B, free 1919.0 MiB)
19:48:59.498 INFO BlockManagerInfo - Added broadcast_510_piece0 in memory on localhost:36125 (size: 187.0 B, free: 1919.8 MiB)
19:48:59.498 INFO SparkContext - Created broadcast 510 from broadcast at CramSource.java:114
19:48:59.499 INFO MemoryStore - Block broadcast_511 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
19:48:59.505 INFO MemoryStore - Block broadcast_511_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
19:48:59.506 INFO BlockManagerInfo - Added broadcast_511_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:48:59.506 INFO SparkContext - Created broadcast 511 from newAPIHadoopFile at PathSplitSource.java:96
19:48:59.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.520 INFO FileInputFormat - Total input files to process : 1
19:48:59.520 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.545 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:48:59.546 INFO DAGScheduler - Got job 190 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:48:59.546 INFO DAGScheduler - Final stage: ResultStage 252 (collect at ReadsSparkSinkUnitTest.java:182)
19:48:59.546 INFO DAGScheduler - Parents of final stage: List()
19:48:59.546 INFO DAGScheduler - Missing parents: List()
19:48:59.546 INFO DAGScheduler - Submitting ResultStage 252 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:59.557 INFO MemoryStore - Block broadcast_512 stored as values in memory (estimated size 286.8 KiB, free 1918.4 MiB)
19:48:59.558 INFO MemoryStore - Block broadcast_512_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.3 MiB)
19:48:59.558 INFO BlockManagerInfo - Added broadcast_512_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.6 MiB)
19:48:59.558 INFO SparkContext - Created broadcast 512 from broadcast at DAGScheduler.scala:1580
19:48:59.558 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 252 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:59.558 INFO TaskSchedulerImpl - Adding task set 252.0 with 1 tasks resource profile 0
19:48:59.559 INFO TaskSetManager - Starting task 0.0 in stage 252.0 (TID 308) (localhost, executor driver, partition 0, ANY, 7853 bytes)
19:48:59.559 INFO Executor - Running task 0.0 in stage 252.0 (TID 308)
19:48:59.579 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram:0+43713
19:48:59.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.581 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.crai dst=null perm=null proto=rpc
19:48:59.581 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.crai dst=null perm=null proto=rpc
19:48:59.583 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
19:48:59.583 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
19:48:59.584 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
19:48:59.592 INFO Executor - Finished task 0.0 in stage 252.0 (TID 308). 154058 bytes result sent to driver
19:48:59.593 INFO TaskSetManager - Finished task 0.0 in stage 252.0 (TID 308) in 34 ms on localhost (executor driver) (1/1)
19:48:59.593 INFO TaskSchedulerImpl - Removed TaskSet 252.0, whose tasks have all completed, from pool
19:48:59.593 INFO DAGScheduler - ResultStage 252 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.047 s
19:48:59.593 INFO DAGScheduler - Job 190 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:59.593 INFO TaskSchedulerImpl - Killing all running tasks in stage 252: Stage finished
19:48:59.593 INFO DAGScheduler - Job 190 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.047997 s
19:48:59.601 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:59.601 INFO DAGScheduler - Got job 191 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:59.601 INFO DAGScheduler - Final stage: ResultStage 253 (count at ReadsSparkSinkUnitTest.java:185)
19:48:59.601 INFO DAGScheduler - Parents of final stage: List()
19:48:59.601 INFO DAGScheduler - Missing parents: List()
19:48:59.601 INFO DAGScheduler - Submitting ResultStage 253 (MapPartitionsRDD[1197] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:59.612 INFO MemoryStore - Block broadcast_513 stored as values in memory (estimated size 286.8 KiB, free 1918.0 MiB)
19:48:59.613 INFO MemoryStore - Block broadcast_513_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.9 MiB)
19:48:59.614 INFO BlockManagerInfo - Added broadcast_513_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.5 MiB)
19:48:59.614 INFO SparkContext - Created broadcast 513 from broadcast at DAGScheduler.scala:1580
19:48:59.614 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 253 (MapPartitionsRDD[1197] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:59.614 INFO TaskSchedulerImpl - Adding task set 253.0 with 1 tasks resource profile 0
19:48:59.614 INFO TaskSetManager - Starting task 0.0 in stage 253.0 (TID 309) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
19:48:59.615 INFO Executor - Running task 0.0 in stage 253.0 (TID 309)
19:48:59.635 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
19:48:59.640 INFO Executor - Finished task 0.0 in stage 253.0 (TID 309). 989 bytes result sent to driver
19:48:59.640 INFO TaskSetManager - Finished task 0.0 in stage 253.0 (TID 309) in 26 ms on localhost (executor driver) (1/1)
19:48:59.640 INFO TaskSchedulerImpl - Removed TaskSet 253.0, whose tasks have all completed, from pool
19:48:59.641 INFO DAGScheduler - ResultStage 253 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.040 s
19:48:59.641 INFO DAGScheduler - Job 191 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:59.641 INFO TaskSchedulerImpl - Killing all running tasks in stage 253: Stage finished
19:48:59.641 INFO DAGScheduler - Job 191 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.040091 s
19:48:59.644 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:48:59.644 INFO DAGScheduler - Got job 192 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:48:59.644 INFO DAGScheduler - Final stage: ResultStage 254 (count at ReadsSparkSinkUnitTest.java:185)
19:48:59.644 INFO DAGScheduler - Parents of final stage: List()
19:48:59.644 INFO DAGScheduler - Missing parents: List()
19:48:59.644 INFO DAGScheduler - Submitting ResultStage 254 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96), which has no missing parents
19:48:59.656 INFO MemoryStore - Block broadcast_514 stored as values in memory (estimated size 286.8 KiB, free 1917.7 MiB)
19:48:59.657 INFO MemoryStore - Block broadcast_514_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.6 MiB)
19:48:59.657 INFO BlockManagerInfo - Added broadcast_514_piece0 in memory on localhost:36125 (size: 103.6 KiB, free: 1919.4 MiB)
19:48:59.657 INFO SparkContext - Created broadcast 514 from broadcast at DAGScheduler.scala:1580
19:48:59.657 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 254 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:48:59.657 INFO TaskSchedulerImpl - Adding task set 254.0 with 1 tasks resource profile 0
19:48:59.657 INFO TaskSetManager - Starting task 0.0 in stage 254.0 (TID 310) (localhost, executor driver, partition 0, ANY, 7853 bytes)
19:48:59.658 INFO Executor - Running task 0.0 in stage 254.0 (TID 310)
19:48:59.677 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram:0+43713
19:48:59.678 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.678 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram dst=null perm=null proto=rpc
19:48:59.679 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.cram.crai dst=null perm=null proto=rpc
19:48:59.679 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_bd644073-7906-4722-8f7d-b43478572936.crai dst=null perm=null proto=rpc
19:48:59.681 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
19:48:59.681 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
19:48:59.694 INFO Executor - Finished task 0.0 in stage 254.0 (TID 310). 989 bytes result sent to driver
19:48:59.694 INFO TaskSetManager - Finished task 0.0 in stage 254.0 (TID 310) in 37 ms on localhost (executor driver) (1/1)
19:48:59.694 INFO TaskSchedulerImpl - Removed TaskSet 254.0, whose tasks have all completed, from pool
19:48:59.694 INFO DAGScheduler - ResultStage 254 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.049 s
19:48:59.694 INFO DAGScheduler - Job 192 is finished. Cancelling potential speculative or zombie tasks for this job
19:48:59.694 INFO TaskSchedulerImpl - Killing all running tasks in stage 254: Stage finished
19:48:59.694 INFO DAGScheduler - Job 192 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.050185 s
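Jobs 190 through 192 are the verification pass: the test collects the reads back from the rewritten CRAM and compares counts against the original input. A hypothetical sketch of that kind of check with TestNG-style assertions; originalReads and rewrittenReads are stand-ins for whatever RDDs the real test builds, not the test's actual code:

import org.apache.spark.api.java.JavaRDD;
import org.testng.Assert;

public class RoundTripCheckSketch {
    // Hypothetical helper: the concrete read type and sources differ in the real test.
    static <T> void assertRoundTrip(JavaRDD<T> originalReads, JavaRDD<T> rewrittenReads) {
        // collect() materialises both sides on the driver for an element-count comparison.
        Assert.assertEquals(rewrittenReads.collect().size(), originalReads.collect().size());
        // count() triggers the extra jobs visible in the log above.
        Assert.assertEquals(rewrittenReads.count(), originalReads.count());
    }
}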
19:48:59.703 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:48:59.703 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:59.704 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:48:59.705 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:48:59.707 INFO MemoryStore - Block broadcast_515 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
19:48:59.718 INFO MemoryStore - Block broadcast_515_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.2 MiB)
19:48:59.718 INFO BlockManagerInfo - Added broadcast_515_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:48:59.718 INFO SparkContext - Created broadcast 515 from newAPIHadoopFile at PathSplitSource.java:96
19:48:59.744 INFO MemoryStore - Block broadcast_516 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
19:48:59.750 INFO MemoryStore - Block broadcast_516_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
19:48:59.750 INFO BlockManagerInfo - Added broadcast_516_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.3 MiB)
19:48:59.750 INFO SparkContext - Created broadcast 516 from newAPIHadoopFile at PathSplitSource.java:96
19:48:59.770 INFO FileInputFormat - Total input files to process : 1
19:48:59.772 INFO MemoryStore - Block broadcast_517 stored as values in memory (estimated size 160.7 KiB, free 1916.7 MiB)
19:48:59.772 INFO MemoryStore - Block broadcast_517_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.7 MiB)
19:48:59.773 INFO BlockManagerInfo - Added broadcast_517_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.3 MiB)
19:48:59.773 INFO SparkContext - Created broadcast 517 from broadcast at ReadsSparkSink.java:133
19:48:59.776 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts dst=null perm=null proto=rpc
19:48:59.777 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19:48:59.777 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:59.777 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:59.777 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
19:48:59.783 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:48:59.784 INFO DAGScheduler - Registering RDD 1228 (mapToPair at SparkUtils.java:161) as input to shuffle 51
19:48:59.784 INFO DAGScheduler - Got job 193 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:48:59.784 INFO DAGScheduler - Final stage: ResultStage 256 (runJob at SparkHadoopWriter.scala:83)
19:48:59.784 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 255)
19:48:59.784 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 255)
19:48:59.784 INFO DAGScheduler - Submitting ShuffleMapStage 255 (MapPartitionsRDD[1228] at mapToPair at SparkUtils.java:161), which has no missing parents
19:48:59.801 INFO MemoryStore - Block broadcast_518 stored as values in memory (estimated size 520.4 KiB, free 1916.2 MiB)
19:48:59.802 INFO MemoryStore - Block broadcast_518_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.0 MiB)
19:48:59.802 INFO BlockManagerInfo - Added broadcast_518_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:48:59.802 INFO SparkContext - Created broadcast 518 from broadcast at DAGScheduler.scala:1580
19:48:59.803 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 255 (MapPartitionsRDD[1228] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:48:59.803 INFO TaskSchedulerImpl - Adding task set 255.0 with 1 tasks resource profile 0
19:48:59.803 INFO TaskSetManager - Starting task 0.0 in stage 255.0 (TID 311) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:48:59.803 INFO Executor - Running task 0.0 in stage 255.0 (TID 311)
19:48:59.833 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:48:59.847 INFO Executor - Finished task 0.0 in stage 255.0 (TID 311). 1148 bytes result sent to driver
19:48:59.847 INFO TaskSetManager - Finished task 0.0 in stage 255.0 (TID 311) in 44 ms on localhost (executor driver) (1/1)
19:48:59.847 INFO TaskSchedulerImpl - Removed TaskSet 255.0, whose tasks have all completed, from pool
19:48:59.847 INFO DAGScheduler - ShuffleMapStage 255 (mapToPair at SparkUtils.java:161) finished in 0.063 s
19:48:59.848 INFO DAGScheduler - looking for newly runnable stages
19:48:59.848 INFO DAGScheduler - running: HashSet()
19:48:59.848 INFO DAGScheduler - waiting: HashSet(ResultStage 256)
19:48:59.848 INFO DAGScheduler - failed: HashSet()
19:48:59.848 INFO DAGScheduler - Submitting ResultStage 256 (MapPartitionsRDD[1234] at saveAsTextFile at SamSink.java:65), which has no missing parents
19:48:59.854 INFO MemoryStore - Block broadcast_519 stored as values in memory (estimated size 241.1 KiB, free 1915.8 MiB)
19:48:59.859 INFO BlockManagerInfo - Removed broadcast_509_piece0 on localhost:36125 in memory (size: 58.1 KiB, free: 1919.2 MiB)
19:48:59.859 INFO MemoryStore - Block broadcast_519_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.0 MiB)
19:48:59.859 INFO BlockManagerInfo - Added broadcast_519_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.2 MiB)
19:48:59.859 INFO SparkContext - Created broadcast 519 from broadcast at DAGScheduler.scala:1580
19:48:59.859 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 256 (MapPartitionsRDD[1234] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
19:48:59.859 INFO BlockManagerInfo - Removed broadcast_510_piece0 on localhost:36125 in memory (size: 187.0 B, free: 1919.2 MiB)
19:48:59.859 INFO TaskSchedulerImpl - Adding task set 256.0 with 1 tasks resource profile 0
19:48:59.860 INFO BlockManagerInfo - Removed broadcast_503_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.2 MiB)
19:48:59.860 INFO BlockManagerInfo - Removed broadcast_514_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.3 MiB)
19:48:59.861 INFO BlockManagerInfo - Removed broadcast_516_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.4 MiB)
19:48:59.861 INFO TaskSetManager - Starting task 0.0 in stage 256.0 (TID 312) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:48:59.861 INFO Executor - Running task 0.0 in stage 256.0 (TID 312)
19:48:59.861 INFO BlockManagerInfo - Removed broadcast_507_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.4 MiB)
19:48:59.862 INFO BlockManagerInfo - Removed broadcast_512_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.5 MiB)
19:48:59.863 INFO BlockManagerInfo - Removed broadcast_502_piece0 on localhost:36125 in memory (size: 228.0 B, free: 1919.5 MiB)
19:48:59.863 INFO BlockManagerInfo - Removed broadcast_506_piece0 on localhost:36125 in memory (size: 1473.0 B, free: 1919.5 MiB)
19:48:59.864 INFO BlockManagerInfo - Removed broadcast_511_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.5 MiB)
19:48:59.865 INFO BlockManagerInfo - Removed broadcast_508_piece0 on localhost:36125 in memory (size: 107.3 KiB, free: 1919.6 MiB)
19:48:59.865 INFO BlockManagerInfo - Removed broadcast_513_piece0 on localhost:36125 in memory (size: 103.6 KiB, free: 1919.7 MiB)
19:48:59.867 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:48:59.867 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:48:59.882 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19:48:59.882 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:48:59.882 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:48:59.883 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/_temporary/attempt_202507151948592746273548793779547_1234_m_000000_0/part-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:48:59.885 INFO StateChange - BLOCK* allocate blk_1073741915_1091, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/_temporary/attempt_202507151948592746273548793779547_1234_m_000000_0/part-00000
19:48:59.886 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741915_1091 src: /127.0.0.1:47136 dest: /127.0.0.1:45925
19:48:59.891 INFO clienttrace - src: /127.0.0.1:47136, dest: /127.0.0.1:45925, bytes: 761729, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741915_1091, duration(ns): 4815777
19:48:59.891 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741915_1091, type=LAST_IN_PIPELINE terminating
19:48:59.892 INFO FSNamesystem - BLOCK* blk_1073741915_1091 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/_temporary/attempt_202507151948592746273548793779547_1234_m_000000_0/part-00000
19:49:00.293 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/_temporary/attempt_202507151948592746273548793779547_1234_m_000000_0/part-00000 is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:49:00.293 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/_temporary/attempt_202507151948592746273548793779547_1234_m_000000_0 dst=null perm=null proto=rpc
19:49:00.294 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/_temporary/attempt_202507151948592746273548793779547_1234_m_000000_0 dst=null perm=null proto=rpc
19:49:00.294 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/task_202507151948592746273548793779547_1234_m_000000 dst=null perm=null proto=rpc
19:49:00.295 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/_temporary/attempt_202507151948592746273548793779547_1234_m_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/task_202507151948592746273548793779547_1234_m_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
19:49:00.295 INFO FileOutputCommitter - Saved output of task 'attempt_202507151948592746273548793779547_1234_m_000000_0' to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/task_202507151948592746273548793779547_1234_m_000000
19:49:00.295 INFO SparkHadoopMapRedUtil - attempt_202507151948592746273548793779547_1234_m_000000_0: Committed. Elapsed time: 1 ms.
19:49:00.295 INFO Executor - Finished task 0.0 in stage 256.0 (TID 312). 1858 bytes result sent to driver
19:49:00.296 INFO TaskSetManager - Finished task 0.0 in stage 256.0 (TID 312) in 435 ms on localhost (executor driver) (1/1)
19:49:00.296 INFO TaskSchedulerImpl - Removed TaskSet 256.0, whose tasks have all completed, from pool
19:49:00.296 INFO DAGScheduler - ResultStage 256 (runJob at SparkHadoopWriter.scala:83) finished in 0.448 s
19:49:00.296 INFO DAGScheduler - Job 193 is finished. Cancelling potential speculative or zombie tasks for this job
19:49:00.296 INFO TaskSchedulerImpl - Killing all running tasks in stage 256: Stage finished
19:49:00.296 INFO DAGScheduler - Job 193 finished: runJob at SparkHadoopWriter.scala:83, took 0.512581 s
19:49:00.296 INFO SparkHadoopWriter - Start to commit write Job job_202507151948592746273548793779547_1234.
19:49:00.297 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0 dst=null perm=null proto=rpc
19:49:00.297 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts dst=null perm=null proto=rpc
19:49:00.298 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/task_202507151948592746273548793779547_1234_m_000000 dst=null perm=null proto=rpc
19:49:00.298 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/part-00000 dst=null perm=null proto=rpc
19:49:00.298 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary/0/task_202507151948592746273548793779547_1234_m_000000/part-00000 dst=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/part-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
19:49:00.299 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_temporary dst=null perm=null proto=rpc
19:49:00.300 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:49:00.300 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:49:00.301 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/.spark-staging-1234 dst=null perm=null proto=rpc
19:49:00.301 INFO SparkHadoopWriter - Write Job job_202507151948592746273548793779547_1234 committed. Elapsed time: 4 ms.
19:49:00.301 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:49:00.302 INFO StateChange - BLOCK* allocate blk_1073741916_1092, replicas=127.0.0.1:45925 for /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/header
19:49:00.303 INFO DataNode - Receiving BP-1160364076-10.1.0.127-1752608895182:blk_1073741916_1092 src: /127.0.0.1:47138 dest: /127.0.0.1:45925
19:49:00.304 INFO clienttrace - src: /127.0.0.1:47138, dest: /127.0.0.1:45925, bytes: 85829, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-194223981_1, offset: 0, srvID: bb3d0727-ed5c-43d0-8c99-8dbd6a2fcf0e, blockid: BP-1160364076-10.1.0.127-1752608895182:blk_1073741916_1092, duration(ns): 582422
19:49:00.304 INFO DataNode - PacketResponder: BP-1160364076-10.1.0.127-1752608895182:blk_1073741916_1092, type=LAST_IN_PIPELINE terminating
19:49:00.305 INFO FSNamesystem - BLOCK* blk_1073741916_1092 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/header
19:49:00.705 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/header is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:49:00.706 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts dst=null perm=null proto=rpc
19:49:00.706 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
19:49:00.707 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/output is closed by DFSClient_NONMAPREDUCE_-194223981_1
19:49:00.707 INFO HadoopFileSystemWrapper - Concatenating 2 parts to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam
19:49:00.707 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/header, /user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/part-00000] dst=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
19:49:00.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.709 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam perm=runner:supergroup:rw-r--r-- proto=rpc
19:49:00.709 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam done
19:49:00.709 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam.parts dst=null perm=null proto=rpc
19:49:00.709 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.710 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.710 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.710 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
WARNING 2025-07-15 19:49:00 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
19:49:00.712 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
19:49:00.713 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
19:49:00.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
WARNING 2025-07-15 19:49:00 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
19:49:00.715 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
19:49:00.715 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
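The SamReaderFactory warning above is emitted by htsjdk when it cannot sniff BAM or CRAM magic bytes from a URL or stream, so it falls back to plain-text SAM, which is what was written here. A small, hedged example of opening a SAM input explicitly with htsjdk (file path illustrative, not the test's code):

import htsjdk.samtools.SamInputResource;
import htsjdk.samtools.SamReader;
import htsjdk.samtools.SamReaderFactory;
import htsjdk.samtools.ValidationStringency;
import java.io.File;

public class SamOpenSketch {
    public static void main(String[] args) throws Exception {
        // makeDefault() infers the format from the input; when nothing identifies it as
        // BAM or CRAM, plain-text SAM parsing is used, as the warning above describes.
        try (SamReader reader = SamReaderFactory.makeDefault()
                .validationStringency(ValidationStringency.SILENT)
                .open(SamInputResource.of(new File("example.sam")))) {
            reader.iterator().forEachRemaining(record -> { /* consume records */ });
        }
    }
}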
19:49:00.716 INFO MemoryStore - Block broadcast_520 stored as values in memory (estimated size 160.7 KiB, free 1918.4 MiB)
19:49:00.717 INFO MemoryStore - Block broadcast_520_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.4 MiB)
19:49:00.717 INFO BlockManagerInfo - Added broadcast_520_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.7 MiB)
19:49:00.717 INFO SparkContext - Created broadcast 520 from broadcast at SamSource.java:78
19:49:00.718 INFO MemoryStore - Block broadcast_521 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:49:00.724 INFO MemoryStore - Block broadcast_521_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:49:00.724 INFO BlockManagerInfo - Added broadcast_521_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:49:00.724 INFO SparkContext - Created broadcast 521 from newAPIHadoopFile at SamSource.java:108
19:49:00.727 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.727 INFO FileInputFormat - Total input files to process : 1
19:49:00.727 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.731 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:49:00.732 INFO DAGScheduler - Got job 194 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:49:00.732 INFO DAGScheduler - Final stage: ResultStage 257 (collect at ReadsSparkSinkUnitTest.java:182)
19:49:00.732 INFO DAGScheduler - Parents of final stage: List()
19:49:00.732 INFO DAGScheduler - Missing parents: List()
19:49:00.732 INFO DAGScheduler - Submitting ResultStage 257 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96), which has no missing parents
19:49:00.732 INFO MemoryStore - Block broadcast_522 stored as values in memory (estimated size 7.5 KiB, free 1918.0 MiB)
19:49:00.733 INFO MemoryStore - Block broadcast_522_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.0 MiB)
19:49:00.733 INFO BlockManagerInfo - Added broadcast_522_piece0 in memory on localhost:36125 (size: 3.8 KiB, free: 1919.7 MiB)
19:49:00.733 INFO SparkContext - Created broadcast 522 from broadcast at DAGScheduler.scala:1580
19:49:00.733 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 257 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:49:00.733 INFO TaskSchedulerImpl - Adding task set 257.0 with 1 tasks resource profile 0
19:49:00.733 INFO TaskSetManager - Starting task 0.0 in stage 257.0 (TID 313) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:49:00.734 INFO Executor - Running task 0.0 in stage 257.0 (TID 313)
19:49:00.735 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam:0+847558
19:49:00.736 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.737 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
19:49:00.746 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
19:49:00.752 INFO Executor - Finished task 0.0 in stage 257.0 (TID 313). 651483 bytes result sent to driver
19:49:00.753 INFO TaskSetManager - Finished task 0.0 in stage 257.0 (TID 313) in 20 ms on localhost (executor driver) (1/1)
19:49:00.753 INFO TaskSchedulerImpl - Removed TaskSet 257.0, whose tasks have all completed, from pool
19:49:00.753 INFO DAGScheduler - ResultStage 257 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.021 s
19:49:00.754 INFO DAGScheduler - Job 194 is finished. Cancelling potential speculative or zombie tasks for this job
19:49:00.754 INFO TaskSchedulerImpl - Killing all running tasks in stage 257: Stage finished
19:49:00.754 INFO DAGScheduler - Job 194 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.022347 s
19:49:00.763 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:49:00.763 INFO DAGScheduler - Got job 195 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:49:00.763 INFO DAGScheduler - Final stage: ResultStage 258 (count at ReadsSparkSinkUnitTest.java:185)
19:49:00.763 INFO DAGScheduler - Parents of final stage: List()
19:49:00.763 INFO DAGScheduler - Missing parents: List()
19:49:00.763 INFO DAGScheduler - Submitting ResultStage 258 (MapPartitionsRDD[1221] at filter at ReadsSparkSource.java:96), which has no missing parents
19:49:00.780 INFO MemoryStore - Block broadcast_523 stored as values in memory (estimated size 426.1 KiB, free 1917.6 MiB)
19:49:00.781 INFO MemoryStore - Block broadcast_523_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
19:49:00.781 INFO BlockManagerInfo - Added broadcast_523_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:49:00.781 INFO SparkContext - Created broadcast 523 from broadcast at DAGScheduler.scala:1580
19:49:00.782 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 258 (MapPartitionsRDD[1221] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:49:00.782 INFO TaskSchedulerImpl - Adding task set 258.0 with 1 tasks resource profile 0
19:49:00.782 INFO TaskSetManager - Starting task 0.0 in stage 258.0 (TID 314) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:49:00.782 INFO Executor - Running task 0.0 in stage 258.0 (TID 314)
19:49:00.811 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:49:00.820 INFO Executor - Finished task 0.0 in stage 258.0 (TID 314). 989 bytes result sent to driver
19:49:00.821 INFO TaskSetManager - Finished task 0.0 in stage 258.0 (TID 314) in 39 ms on localhost (executor driver) (1/1)
19:49:00.821 INFO TaskSchedulerImpl - Removed TaskSet 258.0, whose tasks have all completed, from pool
19:49:00.821 INFO DAGScheduler - ResultStage 258 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
19:49:00.821 INFO DAGScheduler - Job 195 is finished. Cancelling potential speculative or zombie tasks for this job
19:49:00.821 INFO TaskSchedulerImpl - Killing all running tasks in stage 258: Stage finished
19:49:00.821 INFO DAGScheduler - Job 195 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058237 s
19:49:00.825 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:49:00.825 INFO DAGScheduler - Got job 196 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:49:00.825 INFO DAGScheduler - Final stage: ResultStage 259 (count at ReadsSparkSinkUnitTest.java:185)
19:49:00.825 INFO DAGScheduler - Parents of final stage: List()
19:49:00.825 INFO DAGScheduler - Missing parents: List()
19:49:00.825 INFO DAGScheduler - Submitting ResultStage 259 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96), which has no missing parents
19:49:00.825 INFO MemoryStore - Block broadcast_524 stored as values in memory (estimated size 7.4 KiB, free 1917.4 MiB)
19:49:00.826 INFO MemoryStore - Block broadcast_524_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1917.4 MiB)
19:49:00.826 INFO BlockManagerInfo - Added broadcast_524_piece0 in memory on localhost:36125 (size: 3.8 KiB, free: 1919.5 MiB)
19:49:00.826 INFO SparkContext - Created broadcast 524 from broadcast at DAGScheduler.scala:1580
19:49:00.826 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 259 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:49:00.826 INFO TaskSchedulerImpl - Adding task set 259.0 with 1 tasks resource profile 0
19:49:00.827 INFO TaskSetManager - Starting task 0.0 in stage 259.0 (TID 315) (localhost, executor driver, partition 0, ANY, 7852 bytes)
19:49:00.827 INFO Executor - Running task 0.0 in stage 259.0 (TID 315)
19:49:00.828 INFO NewHadoopRDD - Input split: hdfs://localhost:41235/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam:0+847558
19:49:00.829 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_b30a5d4c-a87e-45c8-912d-6f5ffc3ab6cd.sam dst=null perm=null proto=rpc
19:49:00.830 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
19:49:00.835 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
19:49:00.836 INFO Executor - Finished task 0.0 in stage 259.0 (TID 315). 946 bytes result sent to driver
19:49:00.836 INFO TaskSetManager - Finished task 0.0 in stage 259.0 (TID 315) in 10 ms on localhost (executor driver) (1/1)
19:49:00.836 INFO TaskSchedulerImpl - Removed TaskSet 259.0, whose tasks have all completed, from pool
19:49:00.836 INFO DAGScheduler - ResultStage 259 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.011 s
19:49:00.836 INFO DAGScheduler - Job 196 is finished. Cancelling potential speculative or zombie tasks for this job
19:49:00.836 INFO TaskSchedulerImpl - Killing all running tasks in stage 259: Stage finished
19:49:00.836 INFO DAGScheduler - Job 196 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.011475 s
19:49:00.839 INFO MemoryStore - Block broadcast_525 stored as values in memory (estimated size 297.9 KiB, free 1917.1 MiB)
19:49:00.845 INFO MemoryStore - Block broadcast_525_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.1 MiB)
19:49:00.845 INFO BlockManagerInfo - Added broadcast_525_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:49:00.845 INFO SparkContext - Created broadcast 525 from newAPIHadoopFile at PathSplitSource.java:96
19:49:00.866 INFO MemoryStore - Block broadcast_526 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
19:49:00.872 INFO MemoryStore - Block broadcast_526_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
19:49:00.872 INFO BlockManagerInfo - Added broadcast_526_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.4 MiB)
19:49:00.873 INFO SparkContext - Created broadcast 526 from newAPIHadoopFile at PathSplitSource.java:96
19:49:00.892 INFO FileInputFormat - Total input files to process : 1
19:49:00.894 INFO MemoryStore - Block broadcast_527 stored as values in memory (estimated size 160.7 KiB, free 1916.6 MiB)
19:49:00.894 INFO MemoryStore - Block broadcast_527_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
19:49:00.894 INFO BlockManagerInfo - Added broadcast_527_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:49:00.894 INFO SparkContext - Created broadcast 527 from broadcast at ReadsSparkSink.java:133
19:49:00.896 INFO MemoryStore - Block broadcast_528 stored as values in memory (estimated size 163.2 KiB, free 1916.4 MiB)
19:49:00.896 INFO MemoryStore - Block broadcast_528_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.4 MiB)
19:49:00.896 INFO BlockManagerInfo - Added broadcast_528_piece0 in memory on localhost:36125 (size: 9.6 KiB, free: 1919.4 MiB)
19:49:00.896 INFO SparkContext - Created broadcast 528 from broadcast at BamSink.java:76
19:49:00.898 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:49:00.898 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:49:00.898 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
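As in the earlier jobs, the committer initialises with algorithm version 1, the two-rename protocol traced above. The version is controlled by a standard Hadoop property; a purely illustrative sketch of switching it (the test itself does not change this setting):

import org.apache.hadoop.conf.Configuration;

public class CommitterVersionSketch {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Version 1 (what this log shows): task output is renamed into _temporary/0/task_*,
        // then renamed again into the destination at job commit.
        // Version 2: task commit renames directly into the destination, skipping one pass.
        conf.setInt("mapreduce.fileoutputcommitter.algorithm.version", 2);
        System.out.println(conf.get("mapreduce.fileoutputcommitter.algorithm.version"));
    }
}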
19:49:00.914 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
19:49:00.915 INFO DAGScheduler - Registering RDD 1253 (mapToPair at SparkUtils.java:161) as input to shuffle 52
19:49:00.915 INFO DAGScheduler - Got job 197 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
19:49:00.915 INFO DAGScheduler - Final stage: ResultStage 261 (runJob at SparkHadoopWriter.scala:83)
19:49:00.915 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 260)
19:49:00.915 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 260)
19:49:00.915 INFO DAGScheduler - Submitting ShuffleMapStage 260 (MapPartitionsRDD[1253] at mapToPair at SparkUtils.java:161), which has no missing parents
19:49:00.932 INFO MemoryStore - Block broadcast_529 stored as values in memory (estimated size 520.4 KiB, free 1915.9 MiB)
19:49:00.933 INFO MemoryStore - Block broadcast_529_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.7 MiB)
19:49:00.933 INFO BlockManagerInfo - Added broadcast_529_piece0 in memory on localhost:36125 (size: 166.1 KiB, free: 1919.2 MiB)
19:49:00.933 INFO SparkContext - Created broadcast 529 from broadcast at DAGScheduler.scala:1580
19:49:00.933 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 260 (MapPartitionsRDD[1253] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
19:49:00.933 INFO TaskSchedulerImpl - Adding task set 260.0 with 1 tasks resource profile 0
19:49:00.934 INFO TaskSetManager - Starting task 0.0 in stage 260.0 (TID 316) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
19:49:00.934 INFO Executor - Running task 0.0 in stage 260.0 (TID 316)
19:49:00.964 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:49:00.979 INFO Executor - Finished task 0.0 in stage 260.0 (TID 316). 1148 bytes result sent to driver
19:49:00.979 INFO TaskSetManager - Finished task 0.0 in stage 260.0 (TID 316) in 45 ms on localhost (executor driver) (1/1)
19:49:00.979 INFO TaskSchedulerImpl - Removed TaskSet 260.0, whose tasks have all completed, from pool
19:49:00.979 INFO DAGScheduler - ShuffleMapStage 260 (mapToPair at SparkUtils.java:161) finished in 0.064 s
19:49:00.979 INFO DAGScheduler - looking for newly runnable stages
19:49:00.979 INFO DAGScheduler - running: HashSet()
19:49:00.979 INFO DAGScheduler - waiting: HashSet(ResultStage 261)
19:49:00.979 INFO DAGScheduler - failed: HashSet()
19:49:00.979 INFO DAGScheduler - Submitting ResultStage 261 (MapPartitionsRDD[1258] at mapToPair at BamSink.java:91), which has no missing parents
19:49:00.986 INFO MemoryStore - Block broadcast_530 stored as values in memory (estimated size 241.4 KiB, free 1915.5 MiB)
19:49:00.990 INFO BlockManagerInfo - Removed broadcast_515_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:49:00.990 INFO MemoryStore - Block broadcast_530_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.8 MiB)
19:49:00.990 INFO BlockManagerInfo - Added broadcast_530_piece0 in memory on localhost:36125 (size: 67.0 KiB, free: 1919.2 MiB)
19:49:00.990 INFO BlockManagerInfo - Removed broadcast_521_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:49:00.991 INFO SparkContext - Created broadcast 530 from broadcast at DAGScheduler.scala:1580
19:49:00.991 INFO BlockManagerInfo - Removed broadcast_526_piece0 on localhost:36125 in memory (size: 50.2 KiB, free: 1919.3 MiB)
19:49:00.991 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 261 (MapPartitionsRDD[1258] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
19:49:00.991 INFO TaskSchedulerImpl - Adding task set 261.0 with 1 tasks resource profile 0
19:49:00.991 INFO BlockManagerInfo - Removed broadcast_517_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:49:00.992 INFO TaskSetManager - Starting task 0.0 in stage 261.0 (TID 317) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
19:49:00.992 INFO BlockManagerInfo - Removed broadcast_522_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1919.3 MiB)
19:49:00.992 INFO Executor - Running task 0.0 in stage 261.0 (TID 317)
19:49:00.992 INFO BlockManagerInfo - Removed broadcast_520_piece0 on localhost:36125 in memory (size: 9.6 KiB, free: 1919.3 MiB)
19:49:00.993 INFO BlockManagerInfo - Removed broadcast_518_piece0 on localhost:36125 in memory (size: 166.1 KiB, free: 1919.5 MiB)
19:49:00.993 INFO BlockManagerInfo - Removed broadcast_523_piece0 on localhost:36125 in memory (size: 153.6 KiB, free: 1919.6 MiB)
19:49:00.993 INFO BlockManagerInfo - Removed broadcast_519_piece0 on localhost:36125 in memory (size: 67.0 KiB, free: 1919.7 MiB)
19:49:00.995 INFO BlockManagerInfo - Removed broadcast_524_piece0 on localhost:36125 in memory (size: 3.8 KiB, free: 1919.7 MiB)
19:49:00.997 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
19:49:00.997 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
19:49:01.008 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:49:01.008 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:49:01.008 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:49:01.008 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
19:49:01.008 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
19:49:01.008 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19:49:01.032 INFO FileOutputCommitter - Saved output of task 'attempt_20250715194900716822195926036861_1258_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest19400406876724989179.bam.parts/_temporary/0/task_20250715194900716822195926036861_1258_r_000000
19:49:01.032 INFO SparkHadoopMapRedUtil - attempt_20250715194900716822195926036861_1258_r_000000_0: Committed. Elapsed time: 0 ms.
19:49:01.033 INFO Executor - Finished task 0.0 in stage 261.0 (TID 317). 1858 bytes result sent to driver
19:49:01.033 INFO TaskSetManager - Finished task 0.0 in stage 261.0 (TID 317) in 41 ms on localhost (executor driver) (1/1)
19:49:01.033 INFO TaskSchedulerImpl - Removed TaskSet 261.0, whose tasks have all completed, from pool
19:49:01.033 INFO DAGScheduler - ResultStage 261 (runJob at SparkHadoopWriter.scala:83) finished in 0.053 s
19:49:01.033 INFO DAGScheduler - Job 197 is finished. Cancelling potential speculative or zombie tasks for this job
19:49:01.033 INFO TaskSchedulerImpl - Killing all running tasks in stage 261: Stage finished
19:49:01.033 INFO DAGScheduler - Job 197 finished: runJob at SparkHadoopWriter.scala:83, took 0.119111 s
19:49:01.034 INFO SparkHadoopWriter - Start to commit write Job job_20250715194900716822195926036861_1258.
19:49:01.039 INFO SparkHadoopWriter - Write Job job_20250715194900716822195926036861_1258 committed. Elapsed time: 5 ms.
19:49:01.050 INFO HadoopFileSystemWrapper - Concatenating 3 parts to file:////tmp/ReadsSparkSinkUnitTest19400406876724989179.bam
19:49:01.054 INFO HadoopFileSystemWrapper - Concatenating to file:////tmp/ReadsSparkSinkUnitTest19400406876724989179.bam done
19:49:01.054 INFO IndexFileMerger - Merging .sbi files in temp directory file:////tmp/ReadsSparkSinkUnitTest19400406876724989179.bam.parts/ to file:////tmp/ReadsSparkSinkUnitTest19400406876724989179.bam.sbi
19:49:01.058 INFO IndexFileMerger - Done merging .sbi files
19:49:01.058 INFO IndexFileMerger - Merging .bai files in temp directory file:////tmp/ReadsSparkSinkUnitTest19400406876724989179.bam.parts/ to file:////tmp/ReadsSparkSinkUnitTest19400406876724989179.bam.bai
19:49:01.063 INFO IndexFileMerger - Done merging .bai files
19:49:01.064 INFO MemoryStore - Block broadcast_531 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
19:49:01.065 INFO MemoryStore - Block broadcast_531_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
19:49:01.065 INFO BlockManagerInfo - Added broadcast_531_piece0 in memory on localhost:36125 (size: 233.0 B, free: 1919.7 MiB)
19:49:01.065 INFO SparkContext - Created broadcast 531 from broadcast at BamSource.java:104
19:49:01.066 INFO MemoryStore - Block broadcast_532 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
19:49:01.072 INFO MemoryStore - Block broadcast_532_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
19:49:01.072 INFO BlockManagerInfo - Added broadcast_532_piece0 in memory on localhost:36125 (size: 50.2 KiB, free: 1919.7 MiB)
19:49:01.072 INFO SparkContext - Created broadcast 532 from newAPIHadoopFile at PathSplitSource.java:96
19:49:01.081 INFO FileInputFormat - Total input files to process : 1
19:49:01.096 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
19:49:01.096 INFO DAGScheduler - Got job 198 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
19:49:01.096 INFO DAGScheduler - Final stage: ResultStage 262 (collect at ReadsSparkSinkUnitTest.java:182)
19:49:01.096 INFO DAGScheduler - Parents of final stage: List()
19:49:01.096 INFO DAGScheduler - Missing parents: List()
19:49:01.096 INFO DAGScheduler - Submitting ResultStage 262 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96), which has no missing parents
19:49:01.102 INFO MemoryStore - Block broadcast_533 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
19:49:01.103 INFO MemoryStore - Block broadcast_533_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.8 MiB)
19:49:01.103 INFO BlockManagerInfo - Added broadcast_533_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.6 MiB)
19:49:01.103 INFO SparkContext - Created broadcast 533 from broadcast at DAGScheduler.scala:1580
19:49:01.103 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 262 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:49:01.103 INFO TaskSchedulerImpl - Adding task set 262.0 with 1 tasks resource profile 0
19:49:01.103 INFO TaskSetManager - Starting task 0.0 in stage 262.0 (TID 318) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:49:01.104 INFO Executor - Running task 0.0 in stage 262.0 (TID 318)
19:49:01.115 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19400406876724989179.bam:0+237038
19:49:01.119 INFO Executor - Finished task 0.0 in stage 262.0 (TID 318). 651483 bytes result sent to driver
19:49:01.120 INFO TaskSetManager - Finished task 0.0 in stage 262.0 (TID 318) in 17 ms on localhost (executor driver) (1/1)
19:49:01.120 INFO TaskSchedulerImpl - Removed TaskSet 262.0, whose tasks have all completed, from pool
19:49:01.121 INFO DAGScheduler - ResultStage 262 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
19:49:01.121 INFO DAGScheduler - Job 198 is finished. Cancelling potential speculative or zombie tasks for this job
19:49:01.121 INFO TaskSchedulerImpl - Killing all running tasks in stage 262: Stage finished
19:49:01.121 INFO DAGScheduler - Job 198 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.024862 s
19:49:01.130 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:49:01.130 INFO DAGScheduler - Got job 199 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:49:01.130 INFO DAGScheduler - Final stage: ResultStage 263 (count at ReadsSparkSinkUnitTest.java:185)
19:49:01.130 INFO DAGScheduler - Parents of final stage: List()
19:49:01.130 INFO DAGScheduler - Missing parents: List()
19:49:01.130 INFO DAGScheduler - Submitting ResultStage 263 (MapPartitionsRDD[1246] at filter at ReadsSparkSource.java:96), which has no missing parents
19:49:01.147 INFO MemoryStore - Block broadcast_534 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
19:49:01.148 INFO MemoryStore - Block broadcast_534_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
19:49:01.148 INFO BlockManagerInfo - Added broadcast_534_piece0 in memory on localhost:36125 (size: 153.6 KiB, free: 1919.5 MiB)
19:49:01.148 INFO SparkContext - Created broadcast 534 from broadcast at DAGScheduler.scala:1580
19:49:01.149 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 263 (MapPartitionsRDD[1246] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:49:01.149 INFO TaskSchedulerImpl - Adding task set 263.0 with 1 tasks resource profile 0
19:49:01.149 INFO TaskSetManager - Starting task 0.0 in stage 263.0 (TID 319) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
19:49:01.149 INFO Executor - Running task 0.0 in stage 263.0 (TID 319)
19:49:01.178 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
19:49:01.188 INFO Executor - Finished task 0.0 in stage 263.0 (TID 319). 989 bytes result sent to driver
19:49:01.188 INFO TaskSetManager - Finished task 0.0 in stage 263.0 (TID 319) in 39 ms on localhost (executor driver) (1/1)
19:49:01.188 INFO TaskSchedulerImpl - Removed TaskSet 263.0, whose tasks have all completed, from pool
19:49:01.188 INFO DAGScheduler - ResultStage 263 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
19:49:01.188 INFO DAGScheduler - Job 199 is finished. Cancelling potential speculative or zombie tasks for this job
19:49:01.188 INFO TaskSchedulerImpl - Killing all running tasks in stage 263: Stage finished
19:49:01.189 INFO DAGScheduler - Job 199 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058752 s
19:49:01.192 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
19:49:01.192 INFO DAGScheduler - Got job 200 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
19:49:01.192 INFO DAGScheduler - Final stage: ResultStage 264 (count at ReadsSparkSinkUnitTest.java:185)
19:49:01.192 INFO DAGScheduler - Parents of final stage: List()
19:49:01.192 INFO DAGScheduler - Missing parents: List()
19:49:01.192 INFO DAGScheduler - Submitting ResultStage 264 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96), which has no missing parents
19:49:01.198 INFO MemoryStore - Block broadcast_535 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
19:49:01.199 INFO MemoryStore - Block broadcast_535_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
19:49:01.199 INFO BlockManagerInfo - Added broadcast_535_piece0 in memory on localhost:36125 (size: 54.5 KiB, free: 1919.4 MiB)
19:49:01.199 INFO SparkContext - Created broadcast 535 from broadcast at DAGScheduler.scala:1580
19:49:01.199 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 264 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
19:49:01.199 INFO TaskSchedulerImpl - Adding task set 264.0 with 1 tasks resource profile 0
19:49:01.200 INFO TaskSetManager - Starting task 0.0 in stage 264.0 (TID 320) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
19:49:01.200 INFO Executor - Running task 0.0 in stage 264.0 (TID 320)
19:49:01.210 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19400406876724989179.bam:0+237038
19:49:01.213 INFO Executor - Finished task 0.0 in stage 264.0 (TID 320). 989 bytes result sent to driver
19:49:01.213 INFO TaskSetManager - Finished task 0.0 in stage 264.0 (TID 320) in 14 ms on localhost (executor driver) (1/1)
19:49:01.213 INFO TaskSchedulerImpl - Removed TaskSet 264.0, whose tasks have all completed, from pool
19:49:01.213 INFO DAGScheduler - ResultStage 264 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.021 s
19:49:01.213 INFO DAGScheduler - Job 200 is finished. Cancelling potential speculative or zombie tasks for this job
19:49:01.213 INFO TaskSchedulerImpl - Killing all running tasks in stage 264: Stage finished
19:49:01.214 INFO DAGScheduler - Job 200 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.021722 s