20:13:41.335 INFO BlockManagerInfo - Removed broadcast_32_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1920.0 MiB)
20:13:41.339 INFO BlockManagerInfo - Removed broadcast_30_piece0 on localhost:35739 in memory (size: 4.7 KiB, free: 1920.0 MiB)
20:13:41.342 INFO BlockManagerInfo - Removed broadcast_25_piece0 on localhost:35739 in memory (size: 4.5 KiB, free: 1920.0 MiB)
20:13:41.344 INFO BlockManagerInfo - Removed broadcast_21_piece0 on localhost:35739 in memory (size: 2.4 KiB, free: 1920.0 MiB)
20:13:41.348 INFO BlockManagerInfo - Removed broadcast_27_piece0 on localhost:35739 in memory (size: 5.1 KiB, free: 1920.0 MiB)
20:13:41.352 INFO BlockManagerInfo - Removed broadcast_24_piece0 on localhost:35739 in memory (size: 4.3 KiB, free: 1920.0 MiB)
20:13:41.360 INFO BlockManagerInfo - Removed broadcast_26_piece0 on localhost:35739 in memory (size: 3.2 KiB, free: 1920.0 MiB)
20:13:41.361 INFO MiniDFSCluster - starting cluster: numNameNodes=1, numDataNodes=1
20:13:41.683 INFO NameNode - Formatting using clusterid: testClusterID
20:13:41.695 INFO FSEditLog - Edit logging is async:true
20:13:41.713 INFO FSNamesystem - KeyProvider: null
20:13:41.714 INFO FSNamesystem - fsLock is fair: true
20:13:41.714 INFO FSNamesystem - Detailed lock hold time metrics enabled: false
20:13:41.715 INFO FSNamesystem - fsOwner = runner (auth:SIMPLE)
20:13:41.715 INFO FSNamesystem - supergroup = supergroup
20:13:41.715 INFO FSNamesystem - isPermissionEnabled = true
20:13:41.715 INFO FSNamesystem - isStoragePolicyEnabled = true
20:13:41.715 INFO FSNamesystem - HA Enabled: false
20:13:41.752 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
20:13:41.757 INFO deprecation - hadoop.configured.node.mapping is deprecated. Instead, use net.topology.configured.node.mapping
20:13:41.758 INFO DatanodeManager - dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
20:13:41.758 INFO DatanodeManager - dfs.namenode.datanode.registration.ip-hostname-check=true
20:13:41.760 INFO BlockManager - dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
20:13:41.760 INFO BlockManager - The block deletion will start around 2025 Feb 10 20:13:41
20:13:41.761 INFO GSet - Computing capacity for map BlocksMap
20:13:41.761 INFO GSet - VM type = 64-bit
20:13:41.762 INFO GSet - 2.0% max memory 3.4 GB = 70 MB
20:13:41.762 INFO GSet - capacity = 2^23 = 8388608 entries
20:13:41.768 INFO BlockManager - Storage policy satisfier is disabled
20:13:41.768 INFO BlockManager - dfs.block.access.token.enable = false
20:13:41.772 INFO BlockManagerSafeMode - dfs.namenode.safemode.threshold-pct = 0.999
20:13:41.772 INFO BlockManagerSafeMode - dfs.namenode.safemode.min.datanodes = 0
20:13:41.772 INFO BlockManagerSafeMode - dfs.namenode.safemode.extension = 0
20:13:41.773 INFO BlockManager - defaultReplication = 1
20:13:41.773 INFO BlockManager - maxReplication = 512
20:13:41.773 INFO BlockManager - minReplication = 1
20:13:41.773 INFO BlockManager - maxReplicationStreams = 2
20:13:41.773 INFO BlockManager - redundancyRecheckInterval = 3000ms
20:13:41.773 INFO BlockManager - encryptDataTransfer = false
20:13:41.773 INFO BlockManager - maxNumBlocksToLog = 1000
20:13:41.796 INFO FSDirectory - GLOBAL serial map: bits=29 maxEntries=536870911
20:13:41.796 INFO FSDirectory - USER serial map: bits=24 maxEntries=16777215
20:13:41.796 INFO FSDirectory - GROUP serial map: bits=24 maxEntries=16777215
20:13:41.796 INFO FSDirectory - XATTR serial map: bits=24 maxEntries=16777215
20:13:41.804 INFO GSet - Computing capacity for map INodeMap
20:13:41.804 INFO GSet - VM type = 64-bit
20:13:41.804 INFO GSet - 1.0% max memory 3.4 GB = 35 MB
20:13:41.804 INFO GSet - capacity = 2^22 = 4194304 entries
20:13:41.805 INFO FSDirectory - ACLs enabled? true
20:13:41.805 INFO FSDirectory - POSIX ACL inheritance enabled? true
20:13:41.805 INFO FSDirectory - XAttrs enabled? true
20:13:41.805 INFO NameNode - Caching file names occurring more than 10 times
20:13:41.809 INFO SnapshotManager - Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
20:13:41.810 INFO SnapshotManager - SkipList is disabled
20:13:41.814 INFO GSet - Computing capacity for map cachedBlocks
20:13:41.814 INFO GSet - VM type = 64-bit
20:13:41.814 INFO GSet - 0.25% max memory 3.4 GB = 8.8 MB
20:13:41.814 INFO GSet - capacity = 2^20 = 1048576 entries
20:13:41.820 INFO TopMetrics - NNTop conf: dfs.namenode.top.window.num.buckets = 10
20:13:41.820 INFO TopMetrics - NNTop conf: dfs.namenode.top.num.users = 10
20:13:41.820 INFO TopMetrics - NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
20:13:41.822 INFO FSNamesystem - Retry cache on namenode is enabled
20:13:41.822 INFO FSNamesystem - Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
20:13:41.823 INFO GSet - Computing capacity for map NameNodeRetryCache
20:13:41.823 INFO GSet - VM type = 64-bit
20:13:41.823 INFO GSet - 0.029999999329447746% max memory 3.4 GB = 1.0 MB
20:13:41.823 INFO GSet - capacity = 2^17 = 131072 entries
20:13:41.836 INFO FSImage - Allocated new BlockPoolId: BP-488470852-10.1.0.79-1739218421831
20:13:41.843 INFO Storage - Storage directory /tmp/minicluster_storage10361427482595794971/name-0-1 has been successfully formatted.
20:13:41.844 INFO Storage - Storage directory /tmp/minicluster_storage10361427482595794971/name-0-2 has been successfully formatted.
20:13:41.861 INFO FSImageFormatProtobuf - Saving image file /tmp/minicluster_storage10361427482595794971/name-0-2/current/fsimage.ckpt_0000000000000000000 using no compression
20:13:41.861 INFO FSImageFormatProtobuf - Saving image file /tmp/minicluster_storage10361427482595794971/name-0-1/current/fsimage.ckpt_0000000000000000000 using no compression
20:13:41.934 INFO FSImageFormatProtobuf - Image file /tmp/minicluster_storage10361427482595794971/name-0-1/current/fsimage.ckpt_0000000000000000000 of size 401 bytes saved in 0 seconds .
20:13:41.934 INFO FSImageFormatProtobuf - Image file /tmp/minicluster_storage10361427482595794971/name-0-2/current/fsimage.ckpt_0000000000000000000 of size 401 bytes saved in 0 seconds .
20:13:41.945 INFO NNStorageRetentionManager - Going to retain 1 images with txid >= 0
20:13:41.965 INFO FSNamesystem - Stopping services started for active state
20:13:41.965 INFO FSNamesystem - Stopping services started for standby state
20:13:41.966 INFO NameNode - createNameNode []
20:13:42.006 WARN MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
20:13:42.014 INFO MetricsSystemImpl - Scheduled Metric snapshot period at 10 second(s).
20:13:42.014 INFO MetricsSystemImpl - NameNode metrics system started
20:13:42.017 INFO NameNodeUtils - fs.defaultFS is hdfs://127.0.0.1:0
20:13:42.041 INFO JvmPauseMonitor - Starting JVM pause monitor
20:13:42.053 INFO DFSUtil - Filter initializers set : org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
20:13:42.058 INFO DFSUtil - Starting Web-server for hdfs at: http://localhost:0
20:13:42.069 INFO log - Logging initialized @26916ms to org.eclipse.jetty.util.log.Slf4jLog
20:13:42.148 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
20:13:42.153 WARN HttpRequestLog - Jetty request log can only be enabled using Log4j
20:13:42.158 INFO HttpServer2 - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
20:13:42.160 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
20:13:42.160 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
20:13:42.162 INFO HttpServer2 - Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context hdfs
20:13:42.162 INFO HttpServer2 - Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context static
20:13:42.196 INFO HttpServer2 - addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
20:13:42.200 INFO HttpServer2 - Jetty bound to port 41679
20:13:42.201 INFO Server - jetty-9.4.56.v20240826; built: 2024-08-26T17:15:05.868Z; git: ec6782ff5ead824dabdcf47fa98f90a4aedff401; jvm 17.0.6+10
20:13:42.228 INFO session - DefaultSessionIdManager workerName=node0
20:13:42.228 INFO session - No SessionScavenger set, using defaults
20:13:42.230 INFO session - node0 Scavenging every 600000ms
20:13:42.242 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
20:13:42.243 INFO ContextHandler - Started o.e.j.s.ServletContextHandler@122de194{static,/static,jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/static,AVAILABLE}
20:13:42.407 INFO ContextHandler - Started o.e.j.w.WebAppContext@715ab10f{hdfs,/,file:///tmp/jetty-localhost-41679-hadoop-hdfs-3_3_6-tests_jar-_-any-3649779739200857674/webapp/,AVAILABLE}{jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/hdfs}
20:13:42.411 INFO AbstractConnector - Started ServerConnector@5119e01a{HTTP/1.1, (http/1.1)}{localhost:41679}
20:13:42.411 INFO Server - Started @27258ms
20:13:42.415 INFO FSEditLog - Edit logging is async:true
20:13:42.425 INFO FSNamesystem - KeyProvider: null
20:13:42.425 INFO FSNamesystem - fsLock is fair: true
20:13:42.425 INFO FSNamesystem - Detailed lock hold time metrics enabled: false
20:13:42.425 INFO FSNamesystem - fsOwner = runner (auth:SIMPLE)
20:13:42.425 INFO FSNamesystem - supergroup = supergroup
20:13:42.425 INFO FSNamesystem - isPermissionEnabled = true
20:13:42.425 INFO FSNamesystem - isStoragePolicyEnabled = true
20:13:42.425 INFO FSNamesystem - HA Enabled: false
20:13:42.425 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
20:13:42.426 INFO DatanodeManager - dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
20:13:42.426 INFO DatanodeManager - dfs.namenode.datanode.registration.ip-hostname-check=true
20:13:42.426 INFO BlockManager - dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
20:13:42.426 INFO BlockManager - The block deletion will start around 2025 Feb 10 20:13:42
20:13:42.426 INFO GSet - Computing capacity for map BlocksMap
20:13:42.426 INFO GSet - VM type = 64-bit
20:13:42.426 INFO GSet - 2.0% max memory 3.4 GB = 70 MB
20:13:42.426 INFO GSet - capacity = 2^23 = 8388608 entries
20:13:42.427 INFO BlockManager - Storage policy satisfier is disabled
20:13:42.427 INFO BlockManager - dfs.block.access.token.enable = false
20:13:42.427 INFO BlockManagerSafeMode - dfs.namenode.safemode.threshold-pct = 0.999
20:13:42.427 INFO BlockManagerSafeMode - dfs.namenode.safemode.min.datanodes = 0
20:13:42.427 INFO BlockManagerSafeMode - dfs.namenode.safemode.extension = 0
20:13:42.427 INFO BlockManager - defaultReplication = 1
20:13:42.427 INFO BlockManager - maxReplication = 512
20:13:42.427 INFO BlockManager - minReplication = 1
20:13:42.427 INFO BlockManager - maxReplicationStreams = 2
20:13:42.427 INFO BlockManager - redundancyRecheckInterval = 3000ms
20:13:42.427 INFO BlockManager - encryptDataTransfer = false
20:13:42.427 INFO BlockManager - maxNumBlocksToLog = 1000
20:13:42.428 INFO GSet - Computing capacity for map INodeMap
20:13:42.428 INFO GSet - VM type = 64-bit
20:13:42.428 INFO GSet - 1.0% max memory 3.4 GB = 35 MB
20:13:42.428 INFO GSet - capacity = 2^22 = 4194304 entries
20:13:42.428 INFO FSDirectory - ACLs enabled? true
20:13:42.429 INFO FSDirectory - POSIX ACL inheritance enabled? true
20:13:42.429 INFO FSDirectory - XAttrs enabled? true
20:13:42.429 INFO NameNode - Caching file names occurring more than 10 times
20:13:42.429 INFO SnapshotManager - Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
20:13:42.429 INFO SnapshotManager - SkipList is disabled
20:13:42.429 INFO GSet - Computing capacity for map cachedBlocks
20:13:42.429 INFO GSet - VM type = 64-bit
20:13:42.429 INFO GSet - 0.25% max memory 3.4 GB = 8.8 MB
20:13:42.429 INFO GSet - capacity = 2^20 = 1048576 entries
20:13:42.429 INFO TopMetrics - NNTop conf: dfs.namenode.top.window.num.buckets = 10
20:13:42.429 INFO TopMetrics - NNTop conf: dfs.namenode.top.num.users = 10
20:13:42.429 INFO TopMetrics - NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
20:13:42.429 INFO FSNamesystem - Retry cache on namenode is enabled
20:13:42.429 INFO FSNamesystem - Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
20:13:42.429 INFO GSet - Computing capacity for map NameNodeRetryCache
20:13:42.429 INFO GSet - VM type = 64-bit
20:13:42.429 INFO GSet - 0.029999999329447746% max memory 3.4 GB = 1.0 MB
20:13:42.429 INFO GSet - capacity = 2^17 = 131072 entries
20:13:42.433 INFO Storage - Lock on /tmp/minicluster_storage10361427482595794971/name-0-1/in_use.lock acquired by nodename 14919@fv-az1120-225
20:13:42.434 INFO Storage - Lock on /tmp/minicluster_storage10361427482595794971/name-0-2/in_use.lock acquired by nodename 14919@fv-az1120-225
20:13:42.435 INFO FileJournalManager - Recovering unfinalized segments in /tmp/minicluster_storage10361427482595794971/name-0-1/current
20:13:42.435 INFO FileJournalManager - Recovering unfinalized segments in /tmp/minicluster_storage10361427482595794971/name-0-2/current
20:13:42.435 INFO FSImage - No edit log streams selected.
20:13:42.435 INFO FSImage - Planning to load image: FSImageFile(file=/tmp/minicluster_storage10361427482595794971/name-0-1/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
20:13:42.453 INFO FSImageFormatPBINode - Loading 1 INodes.
20:13:42.454 INFO FSImageFormatPBINode - Successfully loaded 1 inodes
20:13:42.456 INFO FSImageFormatPBINode - Completed update blocks map and name cache, total waiting duration 0ms.
20:13:42.457 INFO FSImageFormatProtobuf - Loaded FSImage in 0 seconds.
20:13:42.457 INFO FSImage - Loaded image for txid 0 from /tmp/minicluster_storage10361427482595794971/name-0-1/current/fsimage_0000000000000000000
20:13:42.460 INFO FSNamesystem - Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
20:13:42.460 INFO FSEditLog - Starting log segment at 1
20:13:42.469 INFO NameCache - initialized with 0 entries 0 lookups
20:13:42.469 INFO FSNamesystem - Finished loading FSImage in 38 msecs
20:13:42.530 INFO NameNode - RPC server is binding to localhost:0
20:13:42.530 INFO NameNode - Enable NameNode state context:false
20:13:42.533 INFO CallQueueManager - Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
20:13:42.539 INFO Server - Listener at localhost:40199
20:13:42.539 INFO Server - Starting Socket Reader #1 for port 0
20:13:42.559 INFO NameNode - Clients are to use localhost:40199 to access this namenode/service.
20:13:42.561 INFO FSNamesystem - Registered FSNamesystemState, ReplicatedBlocksState and ECBlockGroupsState MBeans.
20:13:42.578 INFO LeaseManager - Number of blocks under construction: 0
20:13:42.584 INFO DatanodeAdminDefaultMonitor - Initialized the Default Decommission and Maintenance monitor
20:13:42.584 INFO BlockManager - Start MarkedDeleteBlockScrubber thread
20:13:42.585 INFO BlockManager - initializing replication queues
20:13:42.586 INFO StateChange - STATE* Leaving safe mode after 0 secs
20:13:42.586 INFO StateChange - STATE* Network topology has 0 racks and 0 datanodes
20:13:42.586 INFO StateChange - STATE* UnderReplicatedBlocks has 0 blocks
20:13:42.591 INFO BlockManager - Total number of blocks = 0
20:13:42.591 INFO BlockManager - Number of invalid blocks = 0
20:13:42.591 INFO BlockManager - Number of under-replicated blocks = 0
20:13:42.591 INFO BlockManager - Number of over-replicated blocks = 0
20:13:42.591 INFO BlockManager - Number of blocks being written = 0
20:13:42.591 INFO StateChange - STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 5 msec
20:13:42.603 INFO Server - IPC Server Responder: starting
20:13:42.606 INFO Server - IPC Server listener on 0: starting
20:13:42.608 INFO NameNode - NameNode RPC up at: localhost/127.0.0.1:40199
20:13:42.609 WARN MetricsLoggerTask - Metrics logging will not be async since the logger is not log4j
20:13:42.610 INFO FSNamesystem - Starting services required for active state
20:13:42.610 INFO FSDirectory - Initializing quota with 12 thread(s)
20:13:42.612 INFO FSDirectory - Quota initialization completed in 1 milliseconds
name space=1
storage space=0
storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0, PROVIDED=0
20:13:42.614 INFO CacheReplicationMonitor - Starting CacheReplicationMonitor with interval 30000 milliseconds
20:13:42.620 INFO MiniDFSCluster - Starting DataNode 0 with dfs.datanode.data.dir: [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data1,[DISK]file:/tmp/minicluster_storage10361427482595794971/data/data2
20:13:42.629 INFO ThrottledAsyncChecker - Scheduling a check for [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data1
20:13:42.636 INFO ThrottledAsyncChecker - Scheduling a check for [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data2
20:13:42.648 INFO MetricsSystemImpl - DataNode metrics system started (again)
20:13:42.651 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
20:13:42.653 INFO BlockScanner - Initialized block scanner with targetBytesPerSec 1048576
20:13:42.656 INFO DataNode - Configured hostname is 127.0.0.1
20:13:42.656 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
20:13:42.657 INFO DataNode - Starting DataNode with maxLockedMemory = 0
20:13:42.661 INFO DataNode - Opened streaming server at /127.0.0.1:38353
20:13:42.662 INFO DataNode - Balancing bandwidth is 104857600 bytes/s
20:13:42.662 INFO DataNode - Number threads for balancing is 100
20:13:42.667 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
20:13:42.667 WARN HttpRequestLog - Jetty request log can only be enabled using Log4j
20:13:42.669 INFO HttpServer2 - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
20:13:42.669 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
20:13:42.669 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
20:13:42.671 INFO HttpServer2 - Jetty bound to port 44547
20:13:42.671 INFO Server - jetty-9.4.56.v20240826; built: 2024-08-26T17:15:05.868Z; git: ec6782ff5ead824dabdcf47fa98f90a4aedff401; jvm 17.0.6+10
20:13:42.672 INFO session - DefaultSessionIdManager workerName=node0
20:13:42.672 INFO session - No SessionScavenger set, using defaults
20:13:42.672 INFO session - node0 Scavenging every 660000ms
20:13:42.673 INFO ContextHandler - Started o.e.j.s.ServletContextHandler@641c872d{static,/static,jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/static,AVAILABLE}
20:13:42.768 INFO ContextHandler - Started o.e.j.w.WebAppContext@30a0c356{datanode,/,file:///tmp/jetty-localhost-44547-hadoop-hdfs-3_3_6-tests_jar-_-any-13659211499471193587/webapp/,AVAILABLE}{jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/datanode}
20:13:42.769 INFO AbstractConnector - Started ServerConnector@4cf22d79{HTTP/1.1, (http/1.1)}{localhost:44547}
20:13:42.769 INFO Server - Started @27617ms
20:13:42.773 WARN DatanodeHttpServer - Got null for restCsrfPreventionFilter - will not do any filtering.
20:13:42.774 INFO DatanodeHttpServer - Listening HTTP traffic on /127.0.0.1:40137
20:13:42.774 INFO JvmPauseMonitor - Starting JVM pause monitor
20:13:42.775 INFO DataNode - dnUserName = runner
20:13:42.775 INFO DataNode - supergroup = supergroup
20:13:42.782 INFO CallQueueManager - Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
20:13:42.782 INFO Server - Listener at localhost:36915
20:13:42.783 INFO Server - Starting Socket Reader #1 for port 0
20:13:42.786 INFO DataNode - Opened IPC server at /127.0.0.1:36915
20:13:42.801 INFO DataNode - Refresh request received for nameservices: null
20:13:42.801 INFO DataNode - Starting BPOfferServices for nameservices: <default>
20:13:42.808 INFO DataNode - Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:40199 starting to offer service
20:13:42.809 WARN MetricsLoggerTask - Metrics logging will not be async since the logger is not log4j
20:13:42.810 INFO Server - IPC Server Responder: starting
20:13:42.810 INFO Server - IPC Server listener on 0: starting
20:13:42.911 INFO DataNode - Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:40199
20:13:42.912 INFO Storage - Using 2 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=2, dataDirs=2)
20:13:42.914 INFO Storage - Lock on /tmp/minicluster_storage10361427482595794971/data/data1/in_use.lock acquired by nodename 14919@fv-az1120-225
20:13:42.914 INFO Storage - Storage directory with location [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data1 is not formatted for namespace 542214496. Formatting...
20:13:42.915 INFO Storage - Generated new storageID DS-4c0b8763-f3e6-435a-a82e-814ec40b7756 for directory /tmp/minicluster_storage10361427482595794971/data/data1
20:13:42.917 INFO Storage - Lock on /tmp/minicluster_storage10361427482595794971/data/data2/in_use.lock acquired by nodename 14919@fv-az1120-225
20:13:42.917 INFO Storage - Storage directory with location [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data2 is not formatted for namespace 542214496. Formatting...
20:13:42.917 INFO Storage - Generated new storageID DS-1417f4ef-4a89-4151-ba1a-6e87ea20fd13 for directory /tmp/minicluster_storage10361427482595794971/data/data2
20:13:42.917 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
20:13:42.921 INFO MiniDFSCluster - dnInfo.length != numDataNodes
20:13:42.921 INFO MiniDFSCluster - Waiting for cluster to become active
20:13:42.932 INFO Storage - Analyzing storage directories for bpid BP-488470852-10.1.0.79-1739218421831
20:13:42.932 INFO Storage - Locking is disabled for /tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831
20:13:42.933 INFO Storage - Block pool storage directory for location [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data1 and block pool id BP-488470852-10.1.0.79-1739218421831 is not formatted. Formatting ...
20:13:42.933 INFO Storage - Formatting block pool BP-488470852-10.1.0.79-1739218421831 directory /tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current
20:13:42.946 INFO Storage - Analyzing storage directories for bpid BP-488470852-10.1.0.79-1739218421831
20:13:42.946 INFO Storage - Locking is disabled for /tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831
20:13:42.946 INFO Storage - Block pool storage directory for location [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data2 and block pool id BP-488470852-10.1.0.79-1739218421831 is not formatted. Formatting ...
20:13:42.946 INFO Storage - Formatting block pool BP-488470852-10.1.0.79-1739218421831 directory /tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current
20:13:42.947 INFO DataNode - Setting up storage: nsid=542214496;bpid=BP-488470852-10.1.0.79-1739218421831;lv=-57;nsInfo=lv=-66;cid=testClusterID;nsid=542214496;c=1739218421831;bpid=BP-488470852-10.1.0.79-1739218421831;dnuuid=null
20:13:42.948 INFO DataNode - Generated and persisted new Datanode UUID e05b3ae3-c8c8-405b-9911-7c0919b02d43
20:13:42.956 INFO FsDatasetImpl - The datanode lock is a read write lock
20:13:42.978 INFO FsDatasetImpl - Added new volume: DS-4c0b8763-f3e6-435a-a82e-814ec40b7756
20:13:42.978 INFO FsDatasetImpl - Added volume - [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data1, StorageType: DISK
20:13:42.979 INFO FsDatasetImpl - Added new volume: DS-1417f4ef-4a89-4151-ba1a-6e87ea20fd13
20:13:42.979 INFO FsDatasetImpl - Added volume - [DISK]file:/tmp/minicluster_storage10361427482595794971/data/data2, StorageType: DISK
20:13:42.982 INFO MemoryMappableBlockLoader - Initializing cache loader: MemoryMappableBlockLoader.
20:13:42.985 INFO FsDatasetImpl - Registered FSDatasetState MBean
20:13:42.988 INFO FsDatasetImpl - Adding block pool BP-488470852-10.1.0.79-1739218421831
20:13:42.988 INFO FsDatasetImpl - Scanning block pool BP-488470852-10.1.0.79-1739218421831 on volume /tmp/minicluster_storage10361427482595794971/data/data1...
20:13:42.988 INFO FsDatasetImpl - Scanning block pool BP-488470852-10.1.0.79-1739218421831 on volume /tmp/minicluster_storage10361427482595794971/data/data2...
20:13:42.992 WARN FsDatasetImpl - dfsUsed file missing in /tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current, will proceed with Du for space computation calculation,
20:13:42.992 WARN FsDatasetImpl - dfsUsed file missing in /tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current, will proceed with Du for space computation calculation,
20:13:43.016 INFO FsDatasetImpl - Time taken to scan block pool BP-488470852-10.1.0.79-1739218421831 on /tmp/minicluster_storage10361427482595794971/data/data1: 28ms
20:13:43.017 INFO FsDatasetImpl - Time taken to scan block pool BP-488470852-10.1.0.79-1739218421831 on /tmp/minicluster_storage10361427482595794971/data/data2: 29ms
20:13:43.018 INFO FsDatasetImpl - Total time to scan all replicas for block pool BP-488470852-10.1.0.79-1739218421831: 29ms
20:13:43.018 INFO FsDatasetImpl - Adding replicas to map for block pool BP-488470852-10.1.0.79-1739218421831 on volume /tmp/minicluster_storage10361427482595794971/data/data1...
20:13:43.018 INFO BlockPoolSlice - Replica Cache file: /tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/replicas doesn't exist
20:13:43.019 INFO FsDatasetImpl - Adding replicas to map for block pool BP-488470852-10.1.0.79-1739218421831 on volume /tmp/minicluster_storage10361427482595794971/data/data2...
20:13:43.019 INFO BlockPoolSlice - Replica Cache file: /tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/replicas doesn't exist
20:13:43.019 INFO FsDatasetImpl - Time to add replicas to map for block pool BP-488470852-10.1.0.79-1739218421831 on volume /tmp/minicluster_storage10361427482595794971/data/data2: 1ms
20:13:43.019 INFO FsDatasetImpl - Time to add replicas to map for block pool BP-488470852-10.1.0.79-1739218421831 on volume /tmp/minicluster_storage10361427482595794971/data/data1: 1ms
20:13:43.019 INFO FsDatasetImpl - Total time to add all replicas to map for block pool BP-488470852-10.1.0.79-1739218421831: 2ms
20:13:43.020 INFO ThrottledAsyncChecker - Scheduling a check for /tmp/minicluster_storage10361427482595794971/data/data1
20:13:43.022 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
20:13:43.023 INFO MiniDFSCluster - dnInfo.length != numDataNodes
20:13:43.023 INFO MiniDFSCluster - Waiting for cluster to become active
20:13:43.024 INFO DatasetVolumeChecker - Scheduled health check for volume /tmp/minicluster_storage10361427482595794971/data/data1
20:13:43.025 INFO ThrottledAsyncChecker - Scheduling a check for /tmp/minicluster_storage10361427482595794971/data/data2
20:13:43.025 INFO DatasetVolumeChecker - Scheduled health check for volume /tmp/minicluster_storage10361427482595794971/data/data2
20:13:43.027 INFO VolumeScanner - Now scanning bpid BP-488470852-10.1.0.79-1739218421831 on volume /tmp/minicluster_storage10361427482595794971/data/data2
20:13:43.027 INFO VolumeScanner - Now scanning bpid BP-488470852-10.1.0.79-1739218421831 on volume /tmp/minicluster_storage10361427482595794971/data/data1
20:13:43.027 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage10361427482595794971/data/data1, DS-4c0b8763-f3e6-435a-a82e-814ec40b7756): finished scanning block pool BP-488470852-10.1.0.79-1739218421831
20:13:43.027 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage10361427482595794971/data/data2, DS-1417f4ef-4a89-4151-ba1a-6e87ea20fd13): finished scanning block pool BP-488470852-10.1.0.79-1739218421831
20:13:43.028 WARN DirectoryScanner - dfs.datanode.directoryscan.throttle.limit.ms.per.sec set to value above 1000 ms/sec. Assuming default value of -1
20:13:43.028 INFO DirectoryScanner - Periodic Directory Tree Verification scan starting in 8242182ms with interval of 21600000ms and throttle limit of -1ms/s
20:13:43.031 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage10361427482595794971/data/data2, DS-1417f4ef-4a89-4151-ba1a-6e87ea20fd13): no suitable block pools found to scan. Waiting 1814399995 ms.
20:13:43.031 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage10361427482595794971/data/data1, DS-4c0b8763-f3e6-435a-a82e-814ec40b7756): no suitable block pools found to scan. Waiting 1814399996 ms.
20:13:43.032 INFO DataNode - Block pool BP-488470852-10.1.0.79-1739218421831 (Datanode Uuid e05b3ae3-c8c8-405b-9911-7c0919b02d43) service to localhost/127.0.0.1:40199 beginning handshake with NN
20:13:43.042 INFO StateChange - BLOCK* registerDatanode: from DatanodeRegistration(127.0.0.1:38353, datanodeUuid=e05b3ae3-c8c8-405b-9911-7c0919b02d43, infoPort=40137, infoSecurePort=0, ipcPort=36915, storageInfo=lv=-57;cid=testClusterID;nsid=542214496;c=1739218421831) storage e05b3ae3-c8c8-405b-9911-7c0919b02d43
20:13:43.043 INFO NetworkTopology - Adding a new node: /default-rack/127.0.0.1:38353
20:13:43.043 INFO BlockReportLeaseManager - Registered DN e05b3ae3-c8c8-405b-9911-7c0919b02d43 (127.0.0.1:38353).
20:13:43.046 INFO DataNode - Block pool BP-488470852-10.1.0.79-1739218421831 (Datanode Uuid e05b3ae3-c8c8-405b-9911-7c0919b02d43) service to localhost/127.0.0.1:40199 successfully registered with NN
20:13:43.047 INFO DataNode - For namenode localhost/127.0.0.1:40199 using BLOCKREPORT_INTERVAL of 21600000msecs CACHEREPORT_INTERVAL of 10000msecs Initial delay: 0msecs; heartBeatInterval=3000
20:13:43.047 INFO DataNode - Starting IBR Task Handler.
20:13:43.055 INFO DatanodeDescriptor - Adding new storage ID DS-4c0b8763-f3e6-435a-a82e-814ec40b7756 for DN 127.0.0.1:38353
20:13:43.055 INFO DatanodeDescriptor - Adding new storage ID DS-1417f4ef-4a89-4151-ba1a-6e87ea20fd13 for DN 127.0.0.1:38353
20:13:43.060 INFO DataNode - After receiving heartbeat response, updating state of namenode localhost:40199 to active
20:13:43.069 INFO BlockStateChange - BLOCK* processReport 0xc70d3a6d0567f192 with lease ID 0xf6236b121e26ed06: Processing first storage report for DS-1417f4ef-4a89-4151-ba1a-6e87ea20fd13 from datanode DatanodeRegistration(127.0.0.1:38353, datanodeUuid=e05b3ae3-c8c8-405b-9911-7c0919b02d43, infoPort=40137, infoSecurePort=0, ipcPort=36915, storageInfo=lv=-57;cid=testClusterID;nsid=542214496;c=1739218421831)
20:13:43.069 INFO BlockStateChange - BLOCK* processReport 0xc70d3a6d0567f192 with lease ID 0xf6236b121e26ed06: from storage DS-1417f4ef-4a89-4151-ba1a-6e87ea20fd13 node DatanodeRegistration(127.0.0.1:38353, datanodeUuid=e05b3ae3-c8c8-405b-9911-7c0919b02d43, infoPort=40137, infoSecurePort=0, ipcPort=36915, storageInfo=lv=-57;cid=testClusterID;nsid=542214496;c=1739218421831), blocks: 0, hasStaleStorage: true, processing time: 1 msecs, invalidatedBlocks: 0
20:13:43.070 INFO BlockStateChange - BLOCK* processReport 0xc70d3a6d0567f192 with lease ID 0xf6236b121e26ed06: Processing first storage report for DS-4c0b8763-f3e6-435a-a82e-814ec40b7756 from datanode DatanodeRegistration(127.0.0.1:38353, datanodeUuid=e05b3ae3-c8c8-405b-9911-7c0919b02d43, infoPort=40137, infoSecurePort=0, ipcPort=36915, storageInfo=lv=-57;cid=testClusterID;nsid=542214496;c=1739218421831)
20:13:43.070 INFO BlockStateChange - BLOCK* processReport 0xc70d3a6d0567f192 with lease ID 0xf6236b121e26ed06: from storage DS-4c0b8763-f3e6-435a-a82e-814ec40b7756 node DatanodeRegistration(127.0.0.1:38353, datanodeUuid=e05b3ae3-c8c8-405b-9911-7c0919b02d43, infoPort=40137, infoSecurePort=0, ipcPort=36915, storageInfo=lv=-57;cid=testClusterID;nsid=542214496;c=1739218421831), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
20:13:43.079 INFO DataNode - Successfully sent block report 0xc70d3a6d0567f192 with lease ID 0xf6236b121e26ed06 to namenode: localhost/127.0.0.1:40199, containing 2 storage report(s), of which we sent 2. The reports had 0 total blocks and used 1 RPC(s). This took 2 msecs to generate and 16 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
20:13:43.079 INFO DataNode - Got finalize command for block pool BP-488470852-10.1.0.79-1739218421831
20:13:43.124 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
20:13:43.127 INFO MiniDFSCluster - Cluster is active
20:13:43.190 INFO MemoryStore - Block broadcast_34 stored as values in memory (estimated size 297.9 KiB, free 1919.7 MiB)
20:13:43.212 INFO MemoryStore - Block broadcast_34_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.6 MiB)
20:13:43.212 INFO BlockManagerInfo - Added broadcast_34_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:13:43.213 INFO SparkContext - Created broadcast 34 from newAPIHadoopFile at PathSplitSource.java:96
20:13:43.282 INFO MemoryStore - Block broadcast_35 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
20:13:43.293 INFO MemoryStore - Block broadcast_35_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:13:43.294 INFO BlockManagerInfo - Added broadcast_35_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:13:43.294 INFO SparkContext - Created broadcast 35 from newAPIHadoopFile at PathSplitSource.java:96
20:13:43.358 INFO FileInputFormat - Total input files to process : 1
20:13:43.376 INFO MemoryStore - Block broadcast_36 stored as values in memory (estimated size 160.7 KiB, free 1919.1 MiB)
20:13:43.382 INFO MemoryStore - Block broadcast_36_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.1 MiB)
20:13:43.382 INFO BlockManagerInfo - Added broadcast_36_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:13:43.383 INFO SparkContext - Created broadcast 36 from broadcast at ReadsSparkSink.java:133
20:13:43.396 INFO MemoryStore - Block broadcast_37 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
20:13:43.403 INFO MemoryStore - Block broadcast_37_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
20:13:43.403 INFO BlockManagerInfo - Added broadcast_37_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:13:43.403 INFO SparkContext - Created broadcast 37 from broadcast at BamSink.java:76
20:13:43.423 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts dst=null perm=null proto=rpc
20:13:43.427 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:43.428 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:43.428 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:43.445 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:43.462 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:13:43.464 INFO DAGScheduler - Registering RDD 77 (mapToPair at SparkUtils.java:161) as input to shuffle 7
20:13:43.464 INFO DAGScheduler - Got job 20 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:13:43.464 INFO DAGScheduler - Final stage: ResultStage 30 (runJob at SparkHadoopWriter.scala:83)
20:13:43.464 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 29)
20:13:43.464 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 29)
20:13:43.465 INFO DAGScheduler - Submitting ShuffleMapStage 29 (MapPartitionsRDD[77] at mapToPair at SparkUtils.java:161), which has no missing parents
20:13:43.505 INFO MemoryStore - Block broadcast_38 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
20:13:43.507 INFO MemoryStore - Block broadcast_38_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
20:13:43.507 INFO BlockManagerInfo - Added broadcast_38_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.7 MiB)
20:13:43.508 INFO SparkContext - Created broadcast 38 from broadcast at DAGScheduler.scala:1580
20:13:43.508 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 29 (MapPartitionsRDD[77] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:13:43.508 INFO TaskSchedulerImpl - Adding task set 29.0 with 1 tasks resource profile 0
20:13:43.512 INFO TaskSetManager - Starting task 0.0 in stage 29.0 (TID 67) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:13:43.512 INFO Executor - Running task 0.0 in stage 29.0 (TID 67)
20:13:43.601 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:43.672 INFO Executor - Finished task 0.0 in stage 29.0 (TID 67). 1148 bytes result sent to driver
20:13:43.673 INFO TaskSetManager - Finished task 0.0 in stage 29.0 (TID 67) in 164 ms on localhost (executor driver) (1/1)
20:13:43.673 INFO TaskSchedulerImpl - Removed TaskSet 29.0, whose tasks have all completed, from pool
20:13:43.673 INFO DAGScheduler - ShuffleMapStage 29 (mapToPair at SparkUtils.java:161) finished in 0.206 s
20:13:43.673 INFO DAGScheduler - looking for newly runnable stages
20:13:43.674 INFO DAGScheduler - running: HashSet()
20:13:43.674 INFO DAGScheduler - waiting: HashSet(ResultStage 30)
20:13:43.674 INFO DAGScheduler - failed: HashSet()
20:13:43.674 INFO DAGScheduler - Submitting ResultStage 30 (MapPartitionsRDD[82] at mapToPair at BamSink.java:91), which has no missing parents
20:13:43.687 INFO MemoryStore - Block broadcast_39 stored as values in memory (estimated size 241.5 KiB, free 1918.1 MiB)
20:13:43.707 INFO MemoryStore - Block broadcast_39_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.0 MiB)
20:13:43.708 INFO BlockManagerInfo - Added broadcast_39_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.6 MiB)
20:13:43.709 INFO SparkContext - Created broadcast 39 from broadcast at DAGScheduler.scala:1580
20:13:43.709 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 30 (MapPartitionsRDD[82] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:13:43.709 INFO TaskSchedulerImpl - Adding task set 30.0 with 1 tasks resource profile 0
20:13:43.710 INFO TaskSetManager - Starting task 0.0 in stage 30.0 (TID 68) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:13:43.711 INFO Executor - Running task 0.0 in stage 30.0 (TID 68)
20:13:43.712 INFO BlockManagerInfo - Removed broadcast_22_piece0 on localhost:35739 in memory (size: 159.0 B, free: 1919.6 MiB)
20:13:43.715 INFO BlockManager - Removing RDD 47
20:13:43.720 INFO BlockManagerInfo - Removed broadcast_33_piece0 on localhost:35739 in memory (size: 4.8 KiB, free: 1919.7 MiB)
20:13:43.723 INFO BlockManagerInfo - Removed broadcast_28_piece0 on localhost:35739 in memory (size: 320.0 B, free: 1919.7 MiB)
20:13:43.724 INFO BlockManagerInfo - Removed broadcast_35_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:13:43.728 INFO BlockManagerInfo - Removed broadcast_31_piece0 on localhost:35739 in memory (size: 320.0 B, free: 1919.7 MiB)
20:13:43.734 INFO BlockManagerInfo - Removed broadcast_23_piece0 on localhost:35739 in memory (size: 465.0 B, free: 1919.7 MiB)
20:13:43.740 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:13:43.740 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:13:43.846 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:43.846 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:43.846 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:43.847 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:43.847 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:43.847 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:43.870 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:43.888 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:43.891 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:43.921 INFO StateChange - BLOCK* allocate blk_1073741825_1001, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/part-r-00000
20:13:43.955 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741825_1001 src: /127.0.0.1:51304 dest: /127.0.0.1:38353
20:13:43.981 INFO clienttrace - src: /127.0.0.1:51304, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741825_1001, duration(ns): 5292783
20:13:43.981 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
20:13:43.986 INFO FSNamesystem - BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/part-r-00000
20:13:44.389 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:44.390 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:13:44.393 INFO StateChange - BLOCK* allocate blk_1073741826_1002, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/.part-r-00000.sbi
20:13:44.395 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741826_1002 src: /127.0.0.1:51308 dest: /127.0.0.1:38353
20:13:44.396 INFO clienttrace - src: /127.0.0.1:51308, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741826_1002, duration(ns): 605782
20:13:44.396 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
20:13:44.397 INFO FSNamesystem - BLOCK* blk_1073741826_1002 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/.part-r-00000.sbi
20:13:44.798 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:44.802 INFO StateChange - BLOCK* allocate blk_1073741827_1003, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/.part-r-00000.bai
20:13:44.804 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741827_1003 src: /127.0.0.1:51316 dest: /127.0.0.1:38353
20:13:44.805 INFO clienttrace - src: /127.0.0.1:51316, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741827_1003, duration(ns): 687695
20:13:44.806 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741827_1003, type=LAST_IN_PIPELINE terminating
20:13:44.806 INFO FSNamesystem - BLOCK* blk_1073741827_1003 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/.part-r-00000.bai
20:13:45.208 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:45.211 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0 dst=null perm=null proto=rpc
20:13:45.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0 dst=null perm=null proto=rpc
20:13:45.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/task_202502102013434886432853265091581_0082_r_000000 dst=null perm=null proto=rpc
20:13:45.221 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/_temporary/attempt_202502102013434886432853265091581_0082_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/task_202502102013434886432853265091581_0082_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:45.223 INFO FileOutputCommitter - Saved output of task 'attempt_202502102013434886432853265091581_0082_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/task_202502102013434886432853265091581_0082_r_000000
20:13:45.223 INFO SparkHadoopMapRedUtil - attempt_202502102013434886432853265091581_0082_r_000000_0: Committed. Elapsed time: 9 ms.
20:13:45.225 INFO Executor - Finished task 0.0 in stage 30.0 (TID 68). 1858 bytes result sent to driver
20:13:45.226 INFO TaskSetManager - Finished task 0.0 in stage 30.0 (TID 68) in 1516 ms on localhost (executor driver) (1/1)
20:13:45.226 INFO TaskSchedulerImpl - Removed TaskSet 30.0, whose tasks have all completed, from pool
20:13:45.226 INFO DAGScheduler - ResultStage 30 (runJob at SparkHadoopWriter.scala:83) finished in 1.552 s
20:13:45.227 INFO DAGScheduler - Job 20 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:45.227 INFO TaskSchedulerImpl - Killing all running tasks in stage 30: Stage finished
20:13:45.227 INFO DAGScheduler - Job 20 finished: runJob at SparkHadoopWriter.scala:83, took 1.764710 s
20:13:45.229 INFO SparkHadoopWriter - Start to commit write Job job_202502102013434886432853265091581_0082.
20:13:45.232 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:13:45.234 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts dst=null perm=null proto=rpc
20:13:45.235 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/task_202502102013434886432853265091581_0082_r_000000 dst=null perm=null proto=rpc
20:13:45.236 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:45.238 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/task_202502102013434886432853265091581_0082_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:45.239 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:45.241 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/task_202502102013434886432853265091581_0082_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:45.242 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:13:45.244 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary/0/task_202502102013434886432853265091581_0082_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:45.251 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_temporary dst=null perm=null proto=rpc
20:13:45.254 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:45.256 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:45.259 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.spark-staging-82 dst=null perm=null proto=rpc
20:13:45.260 INFO SparkHadoopWriter - Write Job job_202502102013434886432853265091581_0082 committed. Elapsed time: 30 ms.
20:13:45.261 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:45.265 INFO StateChange - BLOCK* allocate blk_1073741828_1004, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/header
20:13:45.267 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741828_1004 src: /127.0.0.1:51332 dest: /127.0.0.1:38353
20:13:45.268 INFO clienttrace - src: /127.0.0.1:51332, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741828_1004, duration(ns): 636843
20:13:45.268 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741828_1004, type=LAST_IN_PIPELINE terminating
20:13:45.269 INFO FSNamesystem - BLOCK* blk_1073741828_1004 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/header
20:13:45.671 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:45.672 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:45.673 INFO StateChange - BLOCK* allocate blk_1073741829_1005, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/terminator
20:13:45.674 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741829_1005 src: /127.0.0.1:51336 dest: /127.0.0.1:38353
20:13:45.676 INFO clienttrace - src: /127.0.0.1:51336, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741829_1005, duration(ns): 581014
20:13:45.676 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741829_1005, type=LAST_IN_PIPELINE terminating
20:13:45.677 INFO FSNamesystem - BLOCK* blk_1073741829_1005 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/terminator
20:13:46.078 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:46.079 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts dst=null perm=null proto=rpc
20:13:46.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:46.085 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:46.090 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam
20:13:46.093 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:46.094 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.095 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:46.095 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam done
20:13:46.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.097 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi
20:13:46.098 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts dst=null perm=null proto=rpc
20:13:46.100 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:46.102 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:46.105 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:46.135 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:46.136 INFO StateChange - BLOCK* allocate blk_1073741830_1006, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi
20:13:46.137 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741830_1006 src: /127.0.0.1:51350 dest: /127.0.0.1:38353
20:13:46.139 INFO clienttrace - src: /127.0.0.1:51350, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741830_1006, duration(ns): 657562
20:13:46.139 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741830_1006, type=LAST_IN_PIPELINE terminating
20:13:46.140 INFO FSNamesystem - BLOCK* blk_1073741830_1006 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi
20:13:46.541 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:46.542 INFO IndexFileMerger - Done merging .sbi files
20:13:46.543 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai
20:13:46.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts dst=null perm=null proto=rpc
20:13:46.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:46.546 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:46.547 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:46.549 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:46.550 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:46.557 INFO StateChange - BLOCK* allocate blk_1073741831_1007, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai
20:13:46.558 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741831_1007 src: /127.0.0.1:51352 dest: /127.0.0.1:38353
20:13:46.560 INFO clienttrace - src: /127.0.0.1:51352, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741831_1007, duration(ns): 605476
20:13:46.560 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741831_1007, type=LAST_IN_PIPELINE terminating
20:13:46.561 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:46.562 INFO IndexFileMerger - Done merging .bai files
20:13:46.562 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.parts dst=null perm=null proto=rpc
20:13:46.572 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi dst=null perm=null proto=rpc
20:13:46.581 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi dst=null perm=null proto=rpc
20:13:46.582 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi dst=null perm=null proto=rpc
20:13:46.584 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:13:46.585 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.585 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.586 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.587 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.593 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:46.598 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:46.599 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:46.600 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi dst=null perm=null proto=rpc
20:13:46.601 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi dst=null perm=null proto=rpc
20:13:46.602 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.sbi dst=null perm=null proto=rpc
20:13:46.604 INFO MemoryStore - Block broadcast_40 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
20:13:46.605 INFO MemoryStore - Block broadcast_40_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
20:13:46.605 INFO BlockManagerInfo - Added broadcast_40_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.7 MiB)
20:13:46.605 INFO SparkContext - Created broadcast 40 from broadcast at BamSource.java:104
20:13:46.608 INFO MemoryStore - Block broadcast_41 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:13:46.615 INFO MemoryStore - Block broadcast_41_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:13:46.615 INFO BlockManagerInfo - Added broadcast_41_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:13:46.615 INFO SparkContext - Created broadcast 41 from newAPIHadoopFile at PathSplitSource.java:96
20:13:46.638 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.639 INFO FileInputFormat - Total input files to process : 1
20:13:46.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.669 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:13:46.670 INFO DAGScheduler - Got job 21 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:13:46.670 INFO DAGScheduler - Final stage: ResultStage 31 (collect at ReadsSparkSinkUnitTest.java:182)
20:13:46.670 INFO DAGScheduler - Parents of final stage: List()
20:13:46.670 INFO DAGScheduler - Missing parents: List()
20:13:46.670 INFO DAGScheduler - Submitting ResultStage 31 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:46.682 INFO MemoryStore - Block broadcast_42 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
20:13:46.683 INFO MemoryStore - Block broadcast_42_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
20:13:46.683 INFO BlockManagerInfo - Added broadcast_42_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:13:46.684 INFO SparkContext - Created broadcast 42 from broadcast at DAGScheduler.scala:1580
20:13:46.684 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 31 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:46.684 INFO TaskSchedulerImpl - Adding task set 31.0 with 1 tasks resource profile 0
20:13:46.685 INFO TaskSetManager - Starting task 0.0 in stage 31.0 (TID 69) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:46.685 INFO Executor - Running task 0.0 in stage 31.0 (TID 69)
20:13:46.700 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam:0+237038
20:13:46.702 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.703 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.704 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.705 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.706 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.708 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:46.715 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:46.721 INFO Executor - Finished task 0.0 in stage 31.0 (TID 69). 651526 bytes result sent to driver
20:13:46.726 INFO TaskSetManager - Finished task 0.0 in stage 31.0 (TID 69) in 41 ms on localhost (executor driver) (1/1)
20:13:46.726 INFO TaskSchedulerImpl - Removed TaskSet 31.0, whose tasks have all completed, from pool
20:13:46.726 INFO DAGScheduler - ResultStage 31 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.055 s
20:13:46.726 INFO DAGScheduler - Job 21 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:46.726 INFO TaskSchedulerImpl - Killing all running tasks in stage 31: Stage finished
20:13:46.726 INFO DAGScheduler - Job 21 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.057221 s
20:13:46.754 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:46.755 INFO DAGScheduler - Got job 22 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:46.755 INFO DAGScheduler - Final stage: ResultStage 32 (count at ReadsSparkSinkUnitTest.java:185)
20:13:46.755 INFO DAGScheduler - Parents of final stage: List()
20:13:46.755 INFO DAGScheduler - Missing parents: List()
20:13:46.755 INFO DAGScheduler - Submitting ResultStage 32 (MapPartitionsRDD[70] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:46.774 INFO MemoryStore - Block broadcast_43 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
20:13:46.776 INFO MemoryStore - Block broadcast_43_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
20:13:46.776 INFO BlockManagerInfo - Added broadcast_43_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:13:46.777 INFO SparkContext - Created broadcast 43 from broadcast at DAGScheduler.scala:1580
20:13:46.777 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 32 (MapPartitionsRDD[70] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:46.777 INFO TaskSchedulerImpl - Adding task set 32.0 with 1 tasks resource profile 0
20:13:46.778 INFO TaskSetManager - Starting task 0.0 in stage 32.0 (TID 70) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:13:46.778 INFO Executor - Running task 0.0 in stage 32.0 (TID 70)
20:13:46.813 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:46.879 INFO Executor - Finished task 0.0 in stage 32.0 (TID 70). 989 bytes result sent to driver
20:13:46.879 INFO TaskSetManager - Finished task 0.0 in stage 32.0 (TID 70) in 102 ms on localhost (executor driver) (1/1)
20:13:46.879 INFO TaskSchedulerImpl - Removed TaskSet 32.0, whose tasks have all completed, from pool
20:13:46.880 INFO DAGScheduler - ResultStage 32 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.124 s
20:13:46.880 INFO DAGScheduler - Job 22 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:46.880 INFO TaskSchedulerImpl - Killing all running tasks in stage 32: Stage finished
20:13:46.880 INFO DAGScheduler - Job 22 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.125683 s
20:13:46.884 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:46.884 INFO DAGScheduler - Got job 23 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:46.884 INFO DAGScheduler - Final stage: ResultStage 33 (count at ReadsSparkSinkUnitTest.java:185)
20:13:46.884 INFO DAGScheduler - Parents of final stage: List()
20:13:46.885 INFO DAGScheduler - Missing parents: List()
20:13:46.885 INFO DAGScheduler - Submitting ResultStage 33 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:46.892 INFO MemoryStore - Block broadcast_44 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
20:13:46.893 INFO MemoryStore - Block broadcast_44_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
20:13:46.893 INFO BlockManagerInfo - Added broadcast_44_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.4 MiB)
20:13:46.894 INFO SparkContext - Created broadcast 44 from broadcast at DAGScheduler.scala:1580
20:13:46.894 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 33 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:46.894 INFO TaskSchedulerImpl - Adding task set 33.0 with 1 tasks resource profile 0
20:13:46.895 INFO TaskSetManager - Starting task 0.0 in stage 33.0 (TID 71) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:46.895 INFO Executor - Running task 0.0 in stage 33.0 (TID 71)
20:13:46.909 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam:0+237038
20:13:46.910 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.911 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam dst=null perm=null proto=rpc
20:13:46.912 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.913 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.913 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5db678e2-5249-4972-a410-c7901aa04b6b.bam.bai dst=null perm=null proto=rpc
20:13:46.915 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:46.918 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:46.919 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:46.921 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:46.924 INFO Executor - Finished task 0.0 in stage 33.0 (TID 71). 989 bytes result sent to driver
20:13:46.924 INFO TaskSetManager - Finished task 0.0 in stage 33.0 (TID 71) in 29 ms on localhost (executor driver) (1/1)
20:13:46.924 INFO TaskSchedulerImpl - Removed TaskSet 33.0, whose tasks have all completed, from pool
20:13:46.925 INFO DAGScheduler - ResultStage 33 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.040 s
20:13:46.925 INFO DAGScheduler - Job 23 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:46.925 INFO TaskSchedulerImpl - Killing all running tasks in stage 33: Stage finished
20:13:46.925 INFO DAGScheduler - Job 23 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.040679 s
20:13:46.930 INFO MemoryStore - Block broadcast_45 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
20:13:46.937 INFO MemoryStore - Block broadcast_45_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
20:13:46.937 INFO BlockManagerInfo - Added broadcast_45_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:13:46.938 INFO SparkContext - Created broadcast 45 from newAPIHadoopFile at PathSplitSource.java:96
20:13:46.965 INFO MemoryStore - Block broadcast_46 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
20:13:46.987 INFO BlockManagerInfo - Removed broadcast_44_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.4 MiB)
20:13:46.991 INFO BlockManagerInfo - Removed broadcast_37_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:13:46.992 INFO BlockManagerInfo - Removed broadcast_34_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:13:46.993 INFO BlockManagerInfo - Removed broadcast_39_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.5 MiB)
20:13:46.995 INFO MemoryStore - Block broadcast_46_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.4 MiB)
20:13:46.995 INFO BlockManagerInfo - Removed broadcast_41_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:13:46.995 INFO BlockManagerInfo - Added broadcast_46_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:13:46.995 INFO SparkContext - Created broadcast 46 from newAPIHadoopFile at PathSplitSource.java:96
20:13:46.996 INFO BlockManagerInfo - Removed broadcast_38_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.7 MiB)
20:13:46.997 INFO BlockManagerInfo - Removed broadcast_40_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.7 MiB)
20:13:46.998 INFO BlockManagerInfo - Removed broadcast_43_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.8 MiB)
20:13:47.000 INFO BlockManagerInfo - Removed broadcast_36_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:13:47.001 INFO BlockManagerInfo - Removed broadcast_42_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.9 MiB)
20:13:47.023 INFO FileInputFormat - Total input files to process : 1
20:13:47.026 INFO MemoryStore - Block broadcast_47 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
20:13:47.029 INFO MemoryStore - Block broadcast_47_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
20:13:47.030 INFO BlockManagerInfo - Added broadcast_47_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:13:47.030 INFO SparkContext - Created broadcast 47 from broadcast at ReadsSparkSink.java:133
20:13:47.032 INFO MemoryStore - Block broadcast_48 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
20:13:47.034 INFO MemoryStore - Block broadcast_48_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
20:13:47.034 INFO BlockManagerInfo - Added broadcast_48_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:13:47.035 INFO SparkContext - Created broadcast 48 from broadcast at BamSink.java:76
20:13:47.038 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts dst=null perm=null proto=rpc
20:13:47.038 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:47.038 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:47.038 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:47.040 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:47.047 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:13:47.047 INFO DAGScheduler - Registering RDD 102 (mapToPair at SparkUtils.java:161) as input to shuffle 8
20:13:47.048 INFO DAGScheduler - Got job 24 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:13:47.048 INFO DAGScheduler - Final stage: ResultStage 35 (runJob at SparkHadoopWriter.scala:83)
20:13:47.048 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 34)
20:13:47.048 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 34)
20:13:47.048 INFO DAGScheduler - Submitting ShuffleMapStage 34 (MapPartitionsRDD[102] at mapToPair at SparkUtils.java:161), which has no missing parents
20:13:47.068 INFO MemoryStore - Block broadcast_49 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
20:13:47.070 INFO MemoryStore - Block broadcast_49_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
20:13:47.070 INFO BlockManagerInfo - Added broadcast_49_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.7 MiB)
20:13:47.070 INFO SparkContext - Created broadcast 49 from broadcast at DAGScheduler.scala:1580
20:13:47.071 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 34 (MapPartitionsRDD[102] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:13:47.071 INFO TaskSchedulerImpl - Adding task set 34.0 with 1 tasks resource profile 0
20:13:47.072 INFO TaskSetManager - Starting task 0.0 in stage 34.0 (TID 72) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:13:47.072 INFO Executor - Running task 0.0 in stage 34.0 (TID 72)
20:13:47.120 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:47.181 INFO Executor - Finished task 0.0 in stage 34.0 (TID 72). 1148 bytes result sent to driver
20:13:47.182 INFO TaskSetManager - Finished task 0.0 in stage 34.0 (TID 72) in 111 ms on localhost (executor driver) (1/1)
20:13:47.182 INFO TaskSchedulerImpl - Removed TaskSet 34.0, whose tasks have all completed, from pool
20:13:47.182 INFO DAGScheduler - ShuffleMapStage 34 (mapToPair at SparkUtils.java:161) finished in 0.133 s
20:13:47.182 INFO DAGScheduler - looking for newly runnable stages
20:13:47.182 INFO DAGScheduler - running: HashSet()
20:13:47.182 INFO DAGScheduler - waiting: HashSet(ResultStage 35)
20:13:47.182 INFO DAGScheduler - failed: HashSet()
20:13:47.182 INFO DAGScheduler - Submitting ResultStage 35 (MapPartitionsRDD[107] at mapToPair at BamSink.java:91), which has no missing parents
20:13:47.195 INFO MemoryStore - Block broadcast_50 stored as values in memory (estimated size 241.5 KiB, free 1918.1 MiB)
20:13:47.196 INFO MemoryStore - Block broadcast_50_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.0 MiB)
20:13:47.196 INFO BlockManagerInfo - Added broadcast_50_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.7 MiB)
20:13:47.196 INFO SparkContext - Created broadcast 50 from broadcast at DAGScheduler.scala:1580
20:13:47.196 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 35 (MapPartitionsRDD[107] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:13:47.197 INFO TaskSchedulerImpl - Adding task set 35.0 with 1 tasks resource profile 0
20:13:47.197 INFO TaskSetManager - Starting task 0.0 in stage 35.0 (TID 73) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:13:47.198 INFO Executor - Running task 0.0 in stage 35.0 (TID 73)
20:13:47.206 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:13:47.206 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:13:47.233 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:47.233 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:47.233 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:47.234 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:47.234 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:47.234 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:47.236 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:47.237 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:47.238 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:47.242 INFO StateChange - BLOCK* allocate blk_1073741832_1008, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/part-r-00000
20:13:47.244 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741832_1008 src: /127.0.0.1:51394 dest: /127.0.0.1:38353
20:13:47.248 INFO clienttrace - src: /127.0.0.1:51394, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741832_1008, duration(ns): 3807244
20:13:47.248 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741832_1008, type=LAST_IN_PIPELINE terminating
20:13:47.250 INFO FSNamesystem - BLOCK* blk_1073741832_1008 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/part-r-00000
20:13:47.651 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:47.652 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:13:47.654 INFO StateChange - BLOCK* allocate blk_1073741833_1009, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/.part-r-00000.sbi
20:13:47.655 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741833_1009 src: /127.0.0.1:42978 dest: /127.0.0.1:38353
20:13:47.656 INFO clienttrace - src: /127.0.0.1:42978, dest: /127.0.0.1:38353, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741833_1009, duration(ns): 719732
20:13:47.657 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741833_1009, type=LAST_IN_PIPELINE terminating
20:13:47.657 INFO FSNamesystem - BLOCK* blk_1073741833_1009 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/.part-r-00000.sbi
20:13:48.058 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:48.061 INFO StateChange - BLOCK* allocate blk_1073741834_1010, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/.part-r-00000.bai
20:13:48.063 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741834_1010 src: /127.0.0.1:42994 dest: /127.0.0.1:38353
20:13:48.064 INFO clienttrace - src: /127.0.0.1:42994, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741834_1010, duration(ns): 674416
20:13:48.064 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741834_1010, type=LAST_IN_PIPELINE terminating
20:13:48.065 INFO FSNamesystem - BLOCK* blk_1073741834_1010 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/.part-r-00000.bai
20:13:48.466 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:48.468 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0 dst=null perm=null proto=rpc
20:13:48.469 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0 dst=null perm=null proto=rpc
20:13:48.469 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/task_202502102013473307569242168978075_0107_r_000000 dst=null perm=null proto=rpc
20:13:48.471 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/_temporary/attempt_202502102013473307569242168978075_0107_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/task_202502102013473307569242168978075_0107_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:48.471 INFO FileOutputCommitter - Saved output of task 'attempt_202502102013473307569242168978075_0107_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/task_202502102013473307569242168978075_0107_r_000000
20:13:48.471 INFO SparkHadoopMapRedUtil - attempt_202502102013473307569242168978075_0107_r_000000_0: Committed. Elapsed time: 2 ms.
20:13:48.472 INFO Executor - Finished task 0.0 in stage 35.0 (TID 73). 1858 bytes result sent to driver
20:13:48.474 INFO TaskSetManager - Finished task 0.0 in stage 35.0 (TID 73) in 1277 ms on localhost (executor driver) (1/1)
20:13:48.474 INFO TaskSchedulerImpl - Removed TaskSet 35.0, whose tasks have all completed, from pool
20:13:48.474 INFO DAGScheduler - ResultStage 35 (runJob at SparkHadoopWriter.scala:83) finished in 1.291 s
20:13:48.474 INFO DAGScheduler - Job 24 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:48.474 INFO TaskSchedulerImpl - Killing all running tasks in stage 35: Stage finished
20:13:48.474 INFO DAGScheduler - Job 24 finished: runJob at SparkHadoopWriter.scala:83, took 1.427733 s
20:13:48.476 INFO SparkHadoopWriter - Start to commit write Job job_202502102013473307569242168978075_0107.
20:13:48.477 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:13:48.478 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts dst=null perm=null proto=rpc
20:13:48.479 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/task_202502102013473307569242168978075_0107_r_000000 dst=null perm=null proto=rpc
20:13:48.479 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:48.480 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/task_202502102013473307569242168978075_0107_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:48.481 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:48.482 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/task_202502102013473307569242168978075_0107_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:48.482 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:13:48.483 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary/0/task_202502102013473307569242168978075_0107_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:48.484 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_temporary dst=null perm=null proto=rpc
20:13:48.485 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:48.486 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:48.487 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.spark-staging-107 dst=null perm=null proto=rpc
20:13:48.487 INFO SparkHadoopWriter - Write Job job_202502102013473307569242168978075_0107 committed. Elapsed time: 11 ms.
20:13:48.488 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:48.491 INFO StateChange - BLOCK* allocate blk_1073741835_1011, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/header
20:13:48.492 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741835_1011 src: /127.0.0.1:43008 dest: /127.0.0.1:38353
20:13:48.494 INFO clienttrace - src: /127.0.0.1:43008, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741835_1011, duration(ns): 563265
20:13:48.494 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741835_1011, type=LAST_IN_PIPELINE terminating
20:13:48.495 INFO FSNamesystem - BLOCK* blk_1073741835_1011 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/header
20:13:48.897 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:48.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:48.900 INFO StateChange - BLOCK* allocate blk_1073741836_1012, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/terminator
20:13:48.901 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741836_1012 src: /127.0.0.1:43020 dest: /127.0.0.1:38353
20:13:48.902 INFO clienttrace - src: /127.0.0.1:43020, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741836_1012, duration(ns): 536767
20:13:48.902 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741836_1012, type=LAST_IN_PIPELINE terminating
20:13:48.903 INFO FSNamesystem - BLOCK* blk_1073741836_1012 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/terminator
20:13:49.052 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741826_1002 replica FinalizedReplica, blk_1073741826_1002, FINALIZED
getNumBytes() = 212
getBytesOnDisk() = 212
getVisibleLength()= 212
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data2
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741826 for deletion
20:13:49.053 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741827_1003 replica FinalizedReplica, blk_1073741827_1003, FINALIZED
getNumBytes() = 5472
getBytesOnDisk() = 5472
getVisibleLength()= 5472
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data1
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741827 for deletion
20:13:49.053 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741826_1002 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741826
20:13:49.054 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741827_1003 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741827
20:13:49.304 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:49.305 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts dst=null perm=null proto=rpc
20:13:49.307 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:49.309 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:49.309 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam
20:13:49.310 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:49.310 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.311 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:49.311 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam done
20:13:49.312 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.312 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi
20:13:49.313 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts dst=null perm=null proto=rpc
20:13:49.314 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:49.315 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:49.316 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:49.317 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
20:13:49.318 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:49.319 INFO StateChange - BLOCK* allocate blk_1073741837_1013, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi
20:13:49.320 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741837_1013 src: /127.0.0.1:43022 dest: /127.0.0.1:38353
20:13:49.322 INFO clienttrace - src: /127.0.0.1:43022, dest: /127.0.0.1:38353, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741837_1013, duration(ns): 657681
20:13:49.322 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741837_1013, type=LAST_IN_PIPELINE terminating
20:13:49.323 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:49.323 INFO IndexFileMerger - Done merging .sbi files
20:13:49.323 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai
20:13:49.324 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts dst=null perm=null proto=rpc
20:13:49.325 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:49.326 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:49.327 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:49.328 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:49.328 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:49.335 INFO StateChange - BLOCK* allocate blk_1073741838_1014, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai
20:13:49.336 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741838_1014 src: /127.0.0.1:43034 dest: /127.0.0.1:38353
20:13:49.338 INFO clienttrace - src: /127.0.0.1:43034, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741838_1014, duration(ns): 584937
20:13:49.338 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741838_1014, type=LAST_IN_PIPELINE terminating
20:13:49.339 INFO FSNamesystem - BLOCK* blk_1073741838_1014 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai
20:13:49.740 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:49.740 INFO IndexFileMerger - Done merging .bai files
20:13:49.741 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.parts dst=null perm=null proto=rpc
20:13:49.750 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:49.760 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi dst=null perm=null proto=rpc
20:13:49.760 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi dst=null perm=null proto=rpc
20:13:49.761 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi dst=null perm=null proto=rpc
20:13:49.763 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
20:13:49.763 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.764 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.765 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.765 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:49.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:49.768 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:49.773 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:49.774 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:49.774 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi dst=null perm=null proto=rpc
20:13:49.775 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi dst=null perm=null proto=rpc
20:13:49.776 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.sbi dst=null perm=null proto=rpc
20:13:49.777 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
20:13:49.778 INFO MemoryStore - Block broadcast_51 stored as values in memory (estimated size 13.3 KiB, free 1918.0 MiB)
20:13:49.778 INFO MemoryStore - Block broadcast_51_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.0 MiB)
20:13:49.779 INFO BlockManagerInfo - Added broadcast_51_piece0 in memory on localhost:35739 (size: 8.3 KiB, free: 1919.6 MiB)
20:13:49.779 INFO SparkContext - Created broadcast 51 from broadcast at BamSource.java:104
20:13:49.780 INFO MemoryStore - Block broadcast_52 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
20:13:49.791 INFO BlockManagerInfo - Removed broadcast_49_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.8 MiB)
20:13:49.792 INFO BlockManagerInfo - Removed broadcast_48_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:13:49.792 INFO BlockManagerInfo - Removed broadcast_46_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:13:49.793 INFO BlockManagerInfo - Removed broadcast_50_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.9 MiB)
20:13:49.795 INFO BlockManagerInfo - Removed broadcast_47_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.9 MiB)
20:13:49.797 INFO MemoryStore - Block broadcast_52_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:13:49.798 INFO BlockManagerInfo - Added broadcast_52_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:13:49.798 INFO SparkContext - Created broadcast 52 from newAPIHadoopFile at PathSplitSource.java:96
20:13:49.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.811 INFO FileInputFormat - Total input files to process : 1
20:13:49.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.834 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:13:49.834 INFO DAGScheduler - Got job 25 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:13:49.834 INFO DAGScheduler - Final stage: ResultStage 36 (collect at ReadsSparkSinkUnitTest.java:182)
20:13:49.834 INFO DAGScheduler - Parents of final stage: List()
20:13:49.834 INFO DAGScheduler - Missing parents: List()
20:13:49.835 INFO DAGScheduler - Submitting ResultStage 36 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:49.846 INFO MemoryStore - Block broadcast_53 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
20:13:49.847 INFO MemoryStore - Block broadcast_53_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
20:13:49.847 INFO BlockManagerInfo - Added broadcast_53_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.8 MiB)
20:13:49.847 INFO SparkContext - Created broadcast 53 from broadcast at DAGScheduler.scala:1580
20:13:49.847 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 36 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:49.847 INFO TaskSchedulerImpl - Adding task set 36.0 with 1 tasks resource profile 0
20:13:49.848 INFO TaskSetManager - Starting task 0.0 in stage 36.0 (TID 74) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:49.849 INFO Executor - Running task 0.0 in stage 36.0 (TID 74)
20:13:49.868 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam:0+237038
20:13:49.869 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.870 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:49.871 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:49.872 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:49.872 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:49.874 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:49.877 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:49.878 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:49.884 INFO Executor - Finished task 0.0 in stage 36.0 (TID 74). 651526 bytes result sent to driver
20:13:49.888 INFO TaskSetManager - Finished task 0.0 in stage 36.0 (TID 74) in 40 ms on localhost (executor driver) (1/1)
20:13:49.888 INFO TaskSchedulerImpl - Removed TaskSet 36.0, whose tasks have all completed, from pool
20:13:49.888 INFO DAGScheduler - ResultStage 36 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.053 s
20:13:49.888 INFO DAGScheduler - Job 25 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:49.888 INFO TaskSchedulerImpl - Killing all running tasks in stage 36: Stage finished
20:13:49.888 INFO DAGScheduler - Job 25 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.054429 s
20:13:49.906 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:49.907 INFO DAGScheduler - Got job 26 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:49.907 INFO DAGScheduler - Final stage: ResultStage 37 (count at ReadsSparkSinkUnitTest.java:185)
20:13:49.907 INFO DAGScheduler - Parents of final stage: List()
20:13:49.907 INFO DAGScheduler - Missing parents: List()
20:13:49.907 INFO DAGScheduler - Submitting ResultStage 37 (MapPartitionsRDD[95] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:49.925 INFO MemoryStore - Block broadcast_54 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
20:13:49.927 INFO MemoryStore - Block broadcast_54_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.5 MiB)
20:13:49.927 INFO BlockManagerInfo - Added broadcast_54_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.7 MiB)
20:13:49.927 INFO SparkContext - Created broadcast 54 from broadcast at DAGScheduler.scala:1580
20:13:49.927 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 37 (MapPartitionsRDD[95] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:49.927 INFO TaskSchedulerImpl - Adding task set 37.0 with 1 tasks resource profile 0
20:13:49.928 INFO TaskSetManager - Starting task 0.0 in stage 37.0 (TID 75) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:13:49.929 INFO Executor - Running task 0.0 in stage 37.0 (TID 75)
20:13:49.964 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:49.981 INFO Executor - Finished task 0.0 in stage 37.0 (TID 75). 989 bytes result sent to driver
20:13:49.981 INFO TaskSetManager - Finished task 0.0 in stage 37.0 (TID 75) in 53 ms on localhost (executor driver) (1/1)
20:13:49.981 INFO TaskSchedulerImpl - Removed TaskSet 37.0, whose tasks have all completed, from pool
20:13:49.982 INFO DAGScheduler - ResultStage 37 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.075 s
20:13:49.982 INFO DAGScheduler - Job 26 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:49.982 INFO TaskSchedulerImpl - Killing all running tasks in stage 37: Stage finished
20:13:49.982 INFO DAGScheduler - Job 26 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.075748 s
20:13:49.985 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:49.986 INFO DAGScheduler - Got job 27 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:49.986 INFO DAGScheduler - Final stage: ResultStage 38 (count at ReadsSparkSinkUnitTest.java:185)
20:13:49.986 INFO DAGScheduler - Parents of final stage: List()
20:13:49.986 INFO DAGScheduler - Missing parents: List()
20:13:49.986 INFO DAGScheduler - Submitting ResultStage 38 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:49.997 INFO MemoryStore - Block broadcast_55 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
20:13:50.004 INFO MemoryStore - Block broadcast_55_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.3 MiB)
20:13:50.004 INFO BlockManagerInfo - Added broadcast_55_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:13:50.004 INFO BlockManagerInfo - Removed broadcast_54_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.8 MiB)
20:13:50.005 INFO SparkContext - Created broadcast 55 from broadcast at DAGScheduler.scala:1580
20:13:50.005 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 38 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:50.005 INFO TaskSchedulerImpl - Adding task set 38.0 with 1 tasks resource profile 0
20:13:50.005 INFO BlockManagerInfo - Removed broadcast_53_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.8 MiB)
20:13:50.006 INFO TaskSetManager - Starting task 0.0 in stage 38.0 (TID 76) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:50.006 INFO Executor - Running task 0.0 in stage 38.0 (TID 76)
20:13:50.021 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam:0+237038
20:13:50.023 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:50.024 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam dst=null perm=null proto=rpc
20:13:50.025 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:50.026 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:50.026 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1ddb47b3-3c93-4e09-a205-af9cc1eb614b.bam.bai dst=null perm=null proto=rpc
20:13:50.028 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:50.031 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:50.032 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:50.034 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:50.037 INFO Executor - Finished task 0.0 in stage 38.0 (TID 76). 989 bytes result sent to driver
20:13:50.037 INFO TaskSetManager - Finished task 0.0 in stage 38.0 (TID 76) in 31 ms on localhost (executor driver) (1/1)
20:13:50.037 INFO TaskSchedulerImpl - Removed TaskSet 38.0, whose tasks have all completed, from pool
20:13:50.037 INFO DAGScheduler - ResultStage 38 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.050 s
20:13:50.038 INFO DAGScheduler - Job 27 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:50.038 INFO TaskSchedulerImpl - Killing all running tasks in stage 38: Stage finished
20:13:50.038 INFO DAGScheduler - Job 27 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.052206 s
20:13:50.043 INFO MemoryStore - Block broadcast_56 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
20:13:50.050 INFO MemoryStore - Block broadcast_56_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.8 MiB)
20:13:50.050 INFO BlockManagerInfo - Added broadcast_56_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:13:50.050 INFO SparkContext - Created broadcast 56 from newAPIHadoopFile at PathSplitSource.java:96
20:13:50.076 INFO MemoryStore - Block broadcast_57 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
20:13:50.083 INFO MemoryStore - Block broadcast_57_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
20:13:50.083 INFO BlockManagerInfo - Added broadcast_57_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:13:50.084 INFO SparkContext - Created broadcast 57 from newAPIHadoopFile at PathSplitSource.java:96
20:13:50.107 INFO FileInputFormat - Total input files to process : 1
20:13:50.110 INFO MemoryStore - Block broadcast_58 stored as values in memory (estimated size 160.7 KiB, free 1918.3 MiB)
20:13:50.118 INFO BlockManagerInfo - Removed broadcast_57_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:13:50.118 INFO MemoryStore - Block broadcast_58_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.6 MiB)
20:13:50.119 INFO BlockManagerInfo - Added broadcast_58_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.8 MiB)
20:13:50.119 INFO BlockManagerInfo - Removed broadcast_55_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.8 MiB)
20:13:50.120 INFO SparkContext - Created broadcast 58 from broadcast at ReadsSparkSink.java:133
20:13:50.120 INFO BlockManagerInfo - Removed broadcast_52_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:13:50.121 INFO BlockManagerInfo - Removed broadcast_45_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:13:50.121 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:13:50.121 INFO BlockManagerInfo - Removed broadcast_51_piece0 on localhost:35739 in memory (size: 8.3 KiB, free: 1919.9 MiB)
20:13:50.122 INFO MemoryStore - Block broadcast_59 stored as values in memory (estimated size 163.2 KiB, free 1919.3 MiB)
20:13:50.124 INFO MemoryStore - Block broadcast_59_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.3 MiB)
20:13:50.124 INFO BlockManagerInfo - Added broadcast_59_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:13:50.124 INFO SparkContext - Created broadcast 59 from broadcast at BamSink.java:76
20:13:50.128 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts dst=null perm=null proto=rpc
20:13:50.129 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:50.129 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:50.129 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:50.130 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:50.138 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:13:50.138 INFO DAGScheduler - Registering RDD 127 (mapToPair at SparkUtils.java:161) as input to shuffle 9
20:13:50.139 INFO DAGScheduler - Got job 28 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:13:50.139 INFO DAGScheduler - Final stage: ResultStage 40 (runJob at SparkHadoopWriter.scala:83)
20:13:50.139 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 39)
20:13:50.139 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 39)
20:13:50.139 INFO DAGScheduler - Submitting ShuffleMapStage 39 (MapPartitionsRDD[127] at mapToPair at SparkUtils.java:161), which has no missing parents
20:13:50.157 INFO MemoryStore - Block broadcast_60 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
20:13:50.159 INFO MemoryStore - Block broadcast_60_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
20:13:50.159 INFO BlockManagerInfo - Added broadcast_60_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.8 MiB)
20:13:50.160 INFO SparkContext - Created broadcast 60 from broadcast at DAGScheduler.scala:1580
20:13:50.160 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 39 (MapPartitionsRDD[127] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:13:50.160 INFO TaskSchedulerImpl - Adding task set 39.0 with 1 tasks resource profile 0
20:13:50.161 INFO TaskSetManager - Starting task 0.0 in stage 39.0 (TID 77) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:13:50.161 INFO Executor - Running task 0.0 in stage 39.0 (TID 77)
20:13:50.216 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:50.237 INFO Executor - Finished task 0.0 in stage 39.0 (TID 77). 1148 bytes result sent to driver
20:13:50.237 INFO TaskSetManager - Finished task 0.0 in stage 39.0 (TID 77) in 77 ms on localhost (executor driver) (1/1)
20:13:50.237 INFO TaskSchedulerImpl - Removed TaskSet 39.0, whose tasks have all completed, from pool
20:13:50.238 INFO DAGScheduler - ShuffleMapStage 39 (mapToPair at SparkUtils.java:161) finished in 0.098 s
20:13:50.238 INFO DAGScheduler - looking for newly runnable stages
20:13:50.238 INFO DAGScheduler - running: HashSet()
20:13:50.238 INFO DAGScheduler - waiting: HashSet(ResultStage 40)
20:13:50.238 INFO DAGScheduler - failed: HashSet()
20:13:50.238 INFO DAGScheduler - Submitting ResultStage 40 (MapPartitionsRDD[132] at mapToPair at BamSink.java:91), which has no missing parents
20:13:50.245 INFO MemoryStore - Block broadcast_61 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
20:13:50.250 INFO MemoryStore - Block broadcast_61_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
20:13:50.250 INFO BlockManagerInfo - Added broadcast_61_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.7 MiB)
20:13:50.251 INFO SparkContext - Created broadcast 61 from broadcast at DAGScheduler.scala:1580
20:13:50.251 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 40 (MapPartitionsRDD[132] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:13:50.251 INFO TaskSchedulerImpl - Adding task set 40.0 with 1 tasks resource profile 0
20:13:50.252 INFO TaskSetManager - Starting task 0.0 in stage 40.0 (TID 78) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:13:50.252 INFO Executor - Running task 0.0 in stage 40.0 (TID 78)
20:13:50.261 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:13:50.262 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:13:50.283 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:50.283 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:50.283 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:50.283 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:50.283 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:50.283 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:50.285 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:50.286 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:50.289 INFO StateChange - BLOCK* allocate blk_1073741839_1015, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/part-r-00000
20:13:50.291 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741839_1015 src: /127.0.0.1:43056 dest: /127.0.0.1:38353
20:13:50.294 INFO clienttrace - src: /127.0.0.1:43056, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741839_1015, duration(ns): 2617310
20:13:50.294 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741839_1015, type=LAST_IN_PIPELINE terminating
20:13:50.295 INFO FSNamesystem - BLOCK* blk_1073741839_1015 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/part-r-00000
20:13:50.697 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:50.698 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:13:50.700 INFO StateChange - BLOCK* allocate blk_1073741840_1016, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/.part-r-00000.bai
20:13:50.702 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741840_1016 src: /127.0.0.1:43058 dest: /127.0.0.1:38353
20:13:50.703 INFO clienttrace - src: /127.0.0.1:43058, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741840_1016, duration(ns): 679412
20:13:50.703 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741840_1016, type=LAST_IN_PIPELINE terminating
20:13:50.704 INFO FSNamesystem - BLOCK* blk_1073741840_1016 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/.part-r-00000.bai
20:13:51.105 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:51.106 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0 dst=null perm=null proto=rpc
20:13:51.107 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0 dst=null perm=null proto=rpc
20:13:51.108 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/task_202502102013504846711817274723305_0132_r_000000 dst=null perm=null proto=rpc
20:13:51.109 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/_temporary/attempt_202502102013504846711817274723305_0132_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/task_202502102013504846711817274723305_0132_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:51.109 INFO FileOutputCommitter - Saved output of task 'attempt_202502102013504846711817274723305_0132_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/task_202502102013504846711817274723305_0132_r_000000
20:13:51.109 INFO SparkHadoopMapRedUtil - attempt_202502102013504846711817274723305_0132_r_000000_0: Committed. Elapsed time: 2 ms.
20:13:51.110 INFO Executor - Finished task 0.0 in stage 40.0 (TID 78). 1858 bytes result sent to driver
20:13:51.111 INFO TaskSetManager - Finished task 0.0 in stage 40.0 (TID 78) in 859 ms on localhost (executor driver) (1/1)
20:13:51.111 INFO TaskSchedulerImpl - Removed TaskSet 40.0, whose tasks have all completed, from pool
20:13:51.111 INFO DAGScheduler - ResultStage 40 (runJob at SparkHadoopWriter.scala:83) finished in 0.873 s
20:13:51.112 INFO DAGScheduler - Job 28 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:51.112 INFO TaskSchedulerImpl - Killing all running tasks in stage 40: Stage finished
20:13:51.112 INFO DAGScheduler - Job 28 finished: runJob at SparkHadoopWriter.scala:83, took 0.974025 s
20:13:51.113 INFO SparkHadoopWriter - Start to commit write Job job_202502102013504846711817274723305_0132.
20:13:51.113 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:13:51.114 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts dst=null perm=null proto=rpc
20:13:51.114 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/task_202502102013504846711817274723305_0132_r_000000 dst=null perm=null proto=rpc
20:13:51.115 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:51.116 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/task_202502102013504846711817274723305_0132_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.116 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:13:51.117 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary/0/task_202502102013504846711817274723305_0132_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.118 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_temporary dst=null perm=null proto=rpc
20:13:51.118 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.119 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:51.120 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/.spark-staging-132 dst=null perm=null proto=rpc
20:13:51.120 INFO SparkHadoopWriter - Write Job job_202502102013504846711817274723305_0132 committed. Elapsed time: 7 ms.
20:13:51.121 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.123 INFO StateChange - BLOCK* allocate blk_1073741841_1017, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/header
20:13:51.124 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741841_1017 src: /127.0.0.1:43060 dest: /127.0.0.1:38353
20:13:51.125 INFO clienttrace - src: /127.0.0.1:43060, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741841_1017, duration(ns): 521612
20:13:51.125 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741841_1017, type=LAST_IN_PIPELINE terminating
20:13:51.126 INFO FSNamesystem - BLOCK* blk_1073741841_1017 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/header
20:13:51.527 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:51.528 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.529 INFO StateChange - BLOCK* allocate blk_1073741842_1018, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/terminator
20:13:51.531 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741842_1018 src: /127.0.0.1:43068 dest: /127.0.0.1:38353
20:13:51.532 INFO clienttrace - src: /127.0.0.1:43068, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741842_1018, duration(ns): 584910
20:13:51.532 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741842_1018, type=LAST_IN_PIPELINE terminating
20:13:51.533 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:51.534 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts dst=null perm=null proto=rpc
20:13:51.535 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.536 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:51.536 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam
20:13:51.537 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.538 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:51.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.539 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam done
20:13:51.539 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:51.539 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai
20:13:51.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts dst=null perm=null proto=rpc
20:13:51.541 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:51.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:51.543 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:51.545 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:51.545 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:51.548 INFO StateChange - BLOCK* allocate blk_1073741843_1019, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai
20:13:51.549 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741843_1019 src: /127.0.0.1:43070 dest: /127.0.0.1:38353
20:13:51.551 INFO clienttrace - src: /127.0.0.1:43070, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741843_1019, duration(ns): 522478
20:13:51.551 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741843_1019, type=LAST_IN_PIPELINE terminating
20:13:51.552 INFO FSNamesystem - BLOCK* blk_1073741843_1019 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai
20:13:51.953 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:51.953 INFO IndexFileMerger - Done merging .bai files
20:13:51.954 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.parts dst=null perm=null proto=rpc
20:13:51.966 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:51.967 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:51.967 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:51.968 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:51.969 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:51.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:51.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:51.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:51.973 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:51.976 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:51.977 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:51.977 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:51.978 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.sbi dst=null perm=null proto=rpc
20:13:51.980 INFO MemoryStore - Block broadcast_62 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:13:51.991 INFO MemoryStore - Block broadcast_62_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:13:51.992 INFO BlockManagerInfo - Added broadcast_62_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:13:51.992 INFO SparkContext - Created broadcast 62 from newAPIHadoopFile at PathSplitSource.java:96
20:13:52.021 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.021 INFO FileInputFormat - Total input files to process : 1
20:13:52.022 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741833_1009 replica FinalizedReplica, blk_1073741833_1009, FINALIZED
getNumBytes() = 13492
getBytesOnDisk() = 13492
getVisibleLength()= 13492
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data1
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741833 for deletion
20:13:52.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741834_1010 replica FinalizedReplica, blk_1073741834_1010, FINALIZED
getNumBytes() = 5472
getBytesOnDisk() = 5472
getVisibleLength()= 5472
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data2
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741834 for deletion
20:13:52.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741834_1010 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741834
20:13:52.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741833_1009 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741833
20:13:52.061 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:13:52.061 INFO DAGScheduler - Got job 29 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:13:52.061 INFO DAGScheduler - Final stage: ResultStage 41 (collect at ReadsSparkSinkUnitTest.java:182)
20:13:52.061 INFO DAGScheduler - Parents of final stage: List()
20:13:52.061 INFO DAGScheduler - Missing parents: List()
20:13:52.061 INFO DAGScheduler - Submitting ResultStage 41 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:52.085 INFO MemoryStore - Block broadcast_63 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
20:13:52.087 INFO MemoryStore - Block broadcast_63_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1917.4 MiB)
20:13:52.087 INFO BlockManagerInfo - Added broadcast_63_piece0 in memory on localhost:35739 (size: 153.7 KiB, free: 1919.5 MiB)
20:13:52.087 INFO SparkContext - Created broadcast 63 from broadcast at DAGScheduler.scala:1580
20:13:52.088 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 41 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:52.088 INFO TaskSchedulerImpl - Adding task set 41.0 with 1 tasks resource profile 0
20:13:52.089 INFO TaskSetManager - Starting task 0.0 in stage 41.0 (TID 79) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:52.089 INFO Executor - Running task 0.0 in stage 41.0 (TID 79)
20:13:52.126 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam:0+237038
20:13:52.127 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.128 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.131 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:52.131 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.132 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.133 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.134 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.134 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.140 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:52.141 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.141 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.142 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:52.151 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:52.213 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.214 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.225 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:52.241 INFO BlockManagerInfo - Removed broadcast_61_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.6 MiB)
20:13:52.242 INFO BlockManagerInfo - Removed broadcast_60_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.7 MiB)
20:13:52.243 INFO BlockManagerInfo - Removed broadcast_58_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:13:52.243 INFO Executor - Finished task 0.0 in stage 41.0 (TID 79). 651569 bytes result sent to driver
20:13:52.245 INFO BlockManagerInfo - Removed broadcast_59_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:13:52.248 INFO TaskSetManager - Finished task 0.0 in stage 41.0 (TID 79) in 160 ms on localhost (executor driver) (1/1)
20:13:52.248 INFO TaskSchedulerImpl - Removed TaskSet 41.0, whose tasks have all completed, from pool
20:13:52.249 INFO DAGScheduler - ResultStage 41 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.186 s
20:13:52.249 INFO DAGScheduler - Job 29 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:52.249 INFO TaskSchedulerImpl - Killing all running tasks in stage 41: Stage finished
20:13:52.249 INFO DAGScheduler - Job 29 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.188254 s
20:13:52.270 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:52.270 INFO DAGScheduler - Got job 30 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:52.270 INFO DAGScheduler - Final stage: ResultStage 42 (count at ReadsSparkSinkUnitTest.java:185)
20:13:52.270 INFO DAGScheduler - Parents of final stage: List()
20:13:52.271 INFO DAGScheduler - Missing parents: List()
20:13:52.271 INFO DAGScheduler - Submitting ResultStage 42 (MapPartitionsRDD[120] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:52.289 INFO MemoryStore - Block broadcast_64 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
20:13:52.290 INFO MemoryStore - Block broadcast_64_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
20:13:52.291 INFO BlockManagerInfo - Added broadcast_64_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.6 MiB)
20:13:52.291 INFO SparkContext - Created broadcast 64 from broadcast at DAGScheduler.scala:1580
20:13:52.291 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 42 (MapPartitionsRDD[120] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:52.291 INFO TaskSchedulerImpl - Adding task set 42.0 with 1 tasks resource profile 0
20:13:52.292 INFO TaskSetManager - Starting task 0.0 in stage 42.0 (TID 80) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:13:52.292 INFO Executor - Running task 0.0 in stage 42.0 (TID 80)
20:13:52.329 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:52.346 INFO Executor - Finished task 0.0 in stage 42.0 (TID 80). 989 bytes result sent to driver
20:13:52.346 INFO TaskSetManager - Finished task 0.0 in stage 42.0 (TID 80) in 54 ms on localhost (executor driver) (1/1)
20:13:52.346 INFO TaskSchedulerImpl - Removed TaskSet 42.0, whose tasks have all completed, from pool
20:13:52.347 INFO DAGScheduler - ResultStage 42 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.076 s
20:13:52.347 INFO DAGScheduler - Job 30 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:52.347 INFO TaskSchedulerImpl - Killing all running tasks in stage 42: Stage finished
20:13:52.347 INFO DAGScheduler - Job 30 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.077009 s
20:13:52.353 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:52.354 INFO DAGScheduler - Got job 31 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:52.354 INFO DAGScheduler - Final stage: ResultStage 43 (count at ReadsSparkSinkUnitTest.java:185)
20:13:52.354 INFO DAGScheduler - Parents of final stage: List()
20:13:52.354 INFO DAGScheduler - Missing parents: List()
20:13:52.354 INFO DAGScheduler - Submitting ResultStage 43 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:52.374 INFO MemoryStore - Block broadcast_65 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:13:52.375 INFO MemoryStore - Block broadcast_65_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:13:52.376 INFO BlockManagerInfo - Added broadcast_65_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:13:52.376 INFO SparkContext - Created broadcast 65 from broadcast at DAGScheduler.scala:1580
20:13:52.376 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 43 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:52.376 INFO TaskSchedulerImpl - Adding task set 43.0 with 1 tasks resource profile 0
20:13:52.377 INFO TaskSetManager - Starting task 0.0 in stage 43.0 (TID 81) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:52.377 INFO Executor - Running task 0.0 in stage 43.0 (TID 81)
20:13:52.425 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam:0+237038
20:13:52.426 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.427 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.429 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.430 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.431 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.431 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.432 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.437 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.437 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam dst=null perm=null proto=rpc
20:13:52.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.506 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5e127439-4ed4-43bf-be4d-2746bebb2e84.bam.bai dst=null perm=null proto=rpc
20:13:52.516 INFO Executor - Finished task 0.0 in stage 43.0 (TID 81). 989 bytes result sent to driver
20:13:52.517 INFO TaskSetManager - Finished task 0.0 in stage 43.0 (TID 81) in 140 ms on localhost (executor driver) (1/1)
20:13:52.517 INFO TaskSchedulerImpl - Removed TaskSet 43.0, whose tasks have all completed, from pool
20:13:52.517 INFO DAGScheduler - ResultStage 43 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.163 s
20:13:52.517 INFO DAGScheduler - Job 31 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:52.517 INFO TaskSchedulerImpl - Killing all running tasks in stage 43: Stage finished
20:13:52.518 INFO DAGScheduler - Job 31 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.164447 s
20:13:52.522 INFO MemoryStore - Block broadcast_66 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
20:13:52.529 INFO MemoryStore - Block broadcast_66_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
20:13:52.529 INFO BlockManagerInfo - Added broadcast_66_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:13:52.530 INFO SparkContext - Created broadcast 66 from newAPIHadoopFile at PathSplitSource.java:96
20:13:52.555 INFO MemoryStore - Block broadcast_67 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
20:13:52.561 INFO MemoryStore - Block broadcast_67_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
20:13:52.562 INFO BlockManagerInfo - Added broadcast_67_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:13:52.562 INFO SparkContext - Created broadcast 67 from newAPIHadoopFile at PathSplitSource.java:96
20:13:52.586 INFO FileInputFormat - Total input files to process : 1
20:13:52.588 INFO MemoryStore - Block broadcast_68 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
20:13:52.590 INFO MemoryStore - Block broadcast_68_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
20:13:52.590 INFO BlockManagerInfo - Added broadcast_68_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:13:52.591 INFO SparkContext - Created broadcast 68 from broadcast at ReadsSparkSink.java:133
20:13:52.591 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:13:52.592 INFO MemoryStore - Block broadcast_69 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
20:13:52.594 INFO MemoryStore - Block broadcast_69_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
20:13:52.594 INFO BlockManagerInfo - Added broadcast_69_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:13:52.594 INFO SparkContext - Created broadcast 69 from broadcast at BamSink.java:76
20:13:52.597 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts dst=null perm=null proto=rpc
20:13:52.597 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:52.597 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:52.597 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:52.598 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:52.604 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:13:52.605 INFO DAGScheduler - Registering RDD 153 (mapToPair at SparkUtils.java:161) as input to shuffle 10
20:13:52.605 INFO DAGScheduler - Got job 32 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:13:52.605 INFO DAGScheduler - Final stage: ResultStage 45 (runJob at SparkHadoopWriter.scala:83)
20:13:52.605 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 44)
20:13:52.606 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 44)
20:13:52.606 INFO DAGScheduler - Submitting ShuffleMapStage 44 (MapPartitionsRDD[153] at mapToPair at SparkUtils.java:161), which has no missing parents
20:13:52.634 INFO MemoryStore - Block broadcast_70 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
20:13:52.635 INFO MemoryStore - Block broadcast_70_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
20:13:52.636 INFO BlockManagerInfo - Added broadcast_70_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.2 MiB)
20:13:52.636 INFO SparkContext - Created broadcast 70 from broadcast at DAGScheduler.scala:1580
20:13:52.636 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 44 (MapPartitionsRDD[153] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:13:52.636 INFO TaskSchedulerImpl - Adding task set 44.0 with 1 tasks resource profile 0
20:13:52.637 INFO TaskSetManager - Starting task 0.0 in stage 44.0 (TID 82) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:13:52.638 INFO Executor - Running task 0.0 in stage 44.0 (TID 82)
20:13:52.670 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:52.688 INFO Executor - Finished task 0.0 in stage 44.0 (TID 82). 1148 bytes result sent to driver
20:13:52.689 INFO TaskSetManager - Finished task 0.0 in stage 44.0 (TID 82) in 52 ms on localhost (executor driver) (1/1)
20:13:52.689 INFO DAGScheduler - ShuffleMapStage 44 (mapToPair at SparkUtils.java:161) finished in 0.083 s
20:13:52.689 INFO DAGScheduler - looking for newly runnable stages
20:13:52.689 INFO DAGScheduler - running: HashSet()
20:13:52.690 INFO DAGScheduler - waiting: HashSet(ResultStage 45)
20:13:52.690 INFO DAGScheduler - failed: HashSet()
20:13:52.690 INFO TaskSchedulerImpl - Removed TaskSet 44.0, whose tasks have all completed, from pool
20:13:52.690 INFO DAGScheduler - Submitting ResultStage 45 (MapPartitionsRDD[158] at mapToPair at BamSink.java:91), which has no missing parents
20:13:52.701 INFO MemoryStore - Block broadcast_71 stored as values in memory (estimated size 241.5 KiB, free 1915.7 MiB)
20:13:52.702 INFO MemoryStore - Block broadcast_71_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.6 MiB)
20:13:52.703 INFO BlockManagerInfo - Added broadcast_71_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.1 MiB)
20:13:52.703 INFO SparkContext - Created broadcast 71 from broadcast at DAGScheduler.scala:1580
20:13:52.703 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 45 (MapPartitionsRDD[158] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:13:52.703 INFO TaskSchedulerImpl - Adding task set 45.0 with 1 tasks resource profile 0
20:13:52.704 INFO TaskSetManager - Starting task 0.0 in stage 45.0 (TID 83) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:13:52.704 INFO Executor - Running task 0.0 in stage 45.0 (TID 83)
20:13:52.712 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:13:52.712 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:13:52.733 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:52.733 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:52.733 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:52.733 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:52.733 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:52.733 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:52.735 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:52.736 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:52.740 INFO StateChange - BLOCK* allocate blk_1073741844_1020, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0/part-r-00000
20:13:52.741 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741844_1020 src: /127.0.0.1:43752 dest: /127.0.0.1:38353
20:13:52.745 INFO clienttrace - src: /127.0.0.1:43752, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741844_1020, duration(ns): 2897109
20:13:52.745 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741844_1020, type=LAST_IN_PIPELINE terminating
20:13:52.745 INFO FSNamesystem - BLOCK* blk_1073741844_1020 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0/part-r-00000
20:13:53.146 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:53.147 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:13:53.148 INFO StateChange - BLOCK* allocate blk_1073741845_1021, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0/.part-r-00000.sbi
20:13:53.149 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741845_1021 src: /127.0.0.1:43758 dest: /127.0.0.1:38353
20:13:53.151 INFO clienttrace - src: /127.0.0.1:43758, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741845_1021, duration(ns): 488715
20:13:53.151 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741845_1021, type=LAST_IN_PIPELINE terminating
20:13:53.152 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:53.153 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0 dst=null perm=null proto=rpc
20:13:53.154 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0 dst=null perm=null proto=rpc
20:13:53.154 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/task_202502102013526926473160357730885_0158_r_000000 dst=null perm=null proto=rpc
20:13:53.155 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/_temporary/attempt_202502102013526926473160357730885_0158_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/task_202502102013526926473160357730885_0158_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:53.155 INFO FileOutputCommitter - Saved output of task 'attempt_202502102013526926473160357730885_0158_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/task_202502102013526926473160357730885_0158_r_000000
20:13:53.155 INFO SparkHadoopMapRedUtil - attempt_202502102013526926473160357730885_0158_r_000000_0: Committed. Elapsed time: 2 ms.
20:13:53.156 INFO Executor - Finished task 0.0 in stage 45.0 (TID 83). 1858 bytes result sent to driver
20:13:53.157 INFO TaskSetManager - Finished task 0.0 in stage 45.0 (TID 83) in 453 ms on localhost (executor driver) (1/1)
20:13:53.157 INFO TaskSchedulerImpl - Removed TaskSet 45.0, whose tasks have all completed, from pool
20:13:53.157 INFO DAGScheduler - ResultStage 45 (runJob at SparkHadoopWriter.scala:83) finished in 0.467 s
20:13:53.157 INFO DAGScheduler - Job 32 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:53.157 INFO TaskSchedulerImpl - Killing all running tasks in stage 45: Stage finished
20:13:53.157 INFO DAGScheduler - Job 32 finished: runJob at SparkHadoopWriter.scala:83, took 0.552980 s
20:13:53.158 INFO SparkHadoopWriter - Start to commit write Job job_202502102013526926473160357730885_0158.
20:13:53.159 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:13:53.159 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts dst=null perm=null proto=rpc
20:13:53.160 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/task_202502102013526926473160357730885_0158_r_000000 dst=null perm=null proto=rpc
20:13:53.161 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:53.162 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/task_202502102013526926473160357730885_0158_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.162 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:13:53.163 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary/0/task_202502102013526926473160357730885_0158_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.163 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_temporary dst=null perm=null proto=rpc
20:13:53.164 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.165 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:53.166 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/.spark-staging-158 dst=null perm=null proto=rpc
20:13:53.166 INFO SparkHadoopWriter - Write Job job_202502102013526926473160357730885_0158 committed. Elapsed time: 7 ms.
20:13:53.167 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.168 INFO StateChange - BLOCK* allocate blk_1073741846_1022, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/header
20:13:53.169 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741846_1022 src: /127.0.0.1:43760 dest: /127.0.0.1:38353
20:13:53.171 INFO clienttrace - src: /127.0.0.1:43760, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741846_1022, duration(ns): 531886
20:13:53.171 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741846_1022, type=LAST_IN_PIPELINE terminating
20:13:53.171 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:53.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.173 INFO StateChange - BLOCK* allocate blk_1073741847_1023, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/terminator
20:13:53.174 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741847_1023 src: /127.0.0.1:43762 dest: /127.0.0.1:38353
20:13:53.175 INFO clienttrace - src: /127.0.0.1:43762, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741847_1023, duration(ns): 405740
20:13:53.175 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741847_1023, type=LAST_IN_PIPELINE terminating
20:13:53.176 INFO FSNamesystem - BLOCK* blk_1073741847_1023 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/terminator
20:13:53.577 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:53.578 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts dst=null perm=null proto=rpc
20:13:53.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.580 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:53.581 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam
20:13:53.581 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.582 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.583 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.583 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam done
20:13:53.583 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.583 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi
20:13:53.584 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts dst=null perm=null proto=rpc
20:13:53.585 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:53.586 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:53.587 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:53.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:53.590 INFO StateChange - BLOCK* allocate blk_1073741848_1024, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi
20:13:53.590 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741848_1024 src: /127.0.0.1:43770 dest: /127.0.0.1:38353
20:13:53.592 INFO clienttrace - src: /127.0.0.1:43770, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741848_1024, duration(ns): 443243
20:13:53.592 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741848_1024, type=LAST_IN_PIPELINE terminating
20:13:53.593 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:53.593 INFO IndexFileMerger - Done merging .sbi files
20:13:53.594 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.parts dst=null perm=null proto=rpc
20:13:53.604 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi dst=null perm=null proto=rpc
20:13:53.604 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi dst=null perm=null proto=rpc
20:13:53.605 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi dst=null perm=null proto=rpc
20:13:53.606 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.607 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.607 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.608 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.609 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.bai dst=null perm=null proto=rpc
20:13:53.609 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bai dst=null perm=null proto=rpc
20:13:53.611 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:53.612 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:53.612 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi dst=null perm=null proto=rpc
20:13:53.613 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi dst=null perm=null proto=rpc
20:13:53.613 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.sbi dst=null perm=null proto=rpc
20:13:53.614 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:13:53.615 INFO MemoryStore - Block broadcast_72 stored as values in memory (estimated size 320.0 B, free 1915.6 MiB)
20:13:53.615 INFO MemoryStore - Block broadcast_72_piece0 stored as bytes in memory (estimated size 233.0 B, free 1915.6 MiB)
20:13:53.616 INFO BlockManagerInfo - Added broadcast_72_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.1 MiB)
20:13:53.616 INFO SparkContext - Created broadcast 72 from broadcast at BamSource.java:104
20:13:53.617 INFO MemoryStore - Block broadcast_73 stored as values in memory (estimated size 297.9 KiB, free 1915.3 MiB)
20:13:53.624 INFO MemoryStore - Block broadcast_73_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
20:13:53.624 INFO BlockManagerInfo - Added broadcast_73_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.1 MiB)
20:13:53.624 INFO SparkContext - Created broadcast 73 from newAPIHadoopFile at PathSplitSource.java:96
20:13:53.634 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.634 INFO FileInputFormat - Total input files to process : 1
20:13:53.635 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.650 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:13:53.651 INFO DAGScheduler - Got job 33 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:13:53.651 INFO DAGScheduler - Final stage: ResultStage 46 (collect at ReadsSparkSinkUnitTest.java:182)
20:13:53.651 INFO DAGScheduler - Parents of final stage: List()
20:13:53.651 INFO DAGScheduler - Missing parents: List()
20:13:53.651 INFO DAGScheduler - Submitting ResultStage 46 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:53.657 INFO MemoryStore - Block broadcast_74 stored as values in memory (estimated size 148.2 KiB, free 1915.1 MiB)
20:13:53.664 INFO MemoryStore - Block broadcast_74_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.1 MiB)
20:13:53.664 INFO BlockManagerInfo - Removed broadcast_67_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:13:53.665 INFO BlockManagerInfo - Added broadcast_74_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.1 MiB)
20:13:53.665 INFO SparkContext - Created broadcast 74 from broadcast at DAGScheduler.scala:1580
20:13:53.666 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 46 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:53.666 INFO TaskSchedulerImpl - Adding task set 46.0 with 1 tasks resource profile 0
20:13:53.666 INFO BlockManagerInfo - Removed broadcast_62_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:13:53.667 INFO BlockManagerInfo - Removed broadcast_71_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.2 MiB)
20:13:53.667 INFO TaskSetManager - Starting task 0.0 in stage 46.0 (TID 84) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:53.668 INFO BlockManagerInfo - Removed broadcast_56_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.2 MiB)
20:13:53.668 INFO Executor - Running task 0.0 in stage 46.0 (TID 84)
20:13:53.668 INFO BlockManagerInfo - Removed broadcast_68_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.2 MiB)
20:13:53.669 INFO BlockManagerInfo - Removed broadcast_64_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.4 MiB)
20:13:53.669 INFO BlockManagerInfo - Removed broadcast_63_piece0 on localhost:35739 in memory (size: 153.7 KiB, free: 1919.5 MiB)
20:13:53.670 INFO BlockManagerInfo - Removed broadcast_65_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:13:53.671 INFO BlockManagerInfo - Removed broadcast_70_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.8 MiB)
20:13:53.672 INFO BlockManagerInfo - Removed broadcast_69_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:13:53.686 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam:0+237038
20:13:53.688 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.688 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.690 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.bai dst=null perm=null proto=rpc
20:13:53.690 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bai dst=null perm=null proto=rpc
20:13:53.692 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:53.695 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:53.699 INFO Executor - Finished task 0.0 in stage 46.0 (TID 84). 651526 bytes result sent to driver
20:13:53.702 INFO TaskSetManager - Finished task 0.0 in stage 46.0 (TID 84) in 35 ms on localhost (executor driver) (1/1)
20:13:53.702 INFO DAGScheduler - ResultStage 46 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.051 s
20:13:53.703 INFO DAGScheduler - Job 33 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:53.703 INFO TaskSchedulerImpl - Removed TaskSet 46.0, whose tasks have all completed, from pool
20:13:53.703 INFO TaskSchedulerImpl - Killing all running tasks in stage 46: Stage finished
20:13:53.703 INFO DAGScheduler - Job 33 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.052466 s
20:13:53.720 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:53.721 INFO DAGScheduler - Got job 34 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:53.721 INFO DAGScheduler - Final stage: ResultStage 47 (count at ReadsSparkSinkUnitTest.java:185)
20:13:53.721 INFO DAGScheduler - Parents of final stage: List()
20:13:53.721 INFO DAGScheduler - Missing parents: List()
20:13:53.721 INFO DAGScheduler - Submitting ResultStage 47 (MapPartitionsRDD[146] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:53.738 INFO MemoryStore - Block broadcast_75 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
20:13:53.739 INFO MemoryStore - Block broadcast_75_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
20:13:53.740 INFO BlockManagerInfo - Added broadcast_75_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.7 MiB)
20:13:53.740 INFO SparkContext - Created broadcast 75 from broadcast at DAGScheduler.scala:1580
20:13:53.740 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 47 (MapPartitionsRDD[146] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:53.740 INFO TaskSchedulerImpl - Adding task set 47.0 with 1 tasks resource profile 0
20:13:53.741 INFO TaskSetManager - Starting task 0.0 in stage 47.0 (TID 85) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:13:53.741 INFO Executor - Running task 0.0 in stage 47.0 (TID 85)
20:13:53.781 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:53.793 INFO Executor - Finished task 0.0 in stage 47.0 (TID 85). 989 bytes result sent to driver
20:13:53.793 INFO TaskSetManager - Finished task 0.0 in stage 47.0 (TID 85) in 52 ms on localhost (executor driver) (1/1)
20:13:53.793 INFO TaskSchedulerImpl - Removed TaskSet 47.0, whose tasks have all completed, from pool
20:13:53.794 INFO DAGScheduler - ResultStage 47 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.073 s
20:13:53.794 INFO DAGScheduler - Job 34 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:53.794 INFO TaskSchedulerImpl - Killing all running tasks in stage 47: Stage finished
20:13:53.794 INFO DAGScheduler - Job 34 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.073752 s
20:13:53.799 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:53.799 INFO DAGScheduler - Got job 35 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:53.799 INFO DAGScheduler - Final stage: ResultStage 48 (count at ReadsSparkSinkUnitTest.java:185)
20:13:53.799 INFO DAGScheduler - Parents of final stage: List()
20:13:53.799 INFO DAGScheduler - Missing parents: List()
20:13:53.799 INFO DAGScheduler - Submitting ResultStage 48 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:53.806 INFO MemoryStore - Block broadcast_76 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
20:13:53.807 INFO MemoryStore - Block broadcast_76_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
20:13:53.807 INFO BlockManagerInfo - Added broadcast_76_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:13:53.807 INFO SparkContext - Created broadcast 76 from broadcast at DAGScheduler.scala:1580
20:13:53.808 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 48 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:53.808 INFO TaskSchedulerImpl - Adding task set 48.0 with 1 tasks resource profile 0
20:13:53.809 INFO TaskSetManager - Starting task 0.0 in stage 48.0 (TID 86) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:53.809 INFO Executor - Running task 0.0 in stage 48.0 (TID 86)
20:13:53.822 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam:0+237038
20:13:53.823 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.824 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam dst=null perm=null proto=rpc
20:13:53.825 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bam.bai dst=null perm=null proto=rpc
20:13:53.825 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_eaf9929a-40c0-40b2-a479-44a39c39c088.bai dst=null perm=null proto=rpc
20:13:53.827 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:53.829 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:53.829 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:53.831 INFO Executor - Finished task 0.0 in stage 48.0 (TID 86). 989 bytes result sent to driver
20:13:53.831 INFO TaskSetManager - Finished task 0.0 in stage 48.0 (TID 86) in 23 ms on localhost (executor driver) (1/1)
20:13:53.831 INFO TaskSchedulerImpl - Removed TaskSet 48.0, whose tasks have all completed, from pool
20:13:53.832 INFO DAGScheduler - ResultStage 48 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.032 s
20:13:53.832 INFO DAGScheduler - Job 35 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:53.832 INFO TaskSchedulerImpl - Killing all running tasks in stage 48: Stage finished
20:13:53.832 INFO DAGScheduler - Job 35 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.033593 s
20:13:53.836 INFO MemoryStore - Block broadcast_77 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:13:53.842 INFO MemoryStore - Block broadcast_77_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:13:53.842 INFO BlockManagerInfo - Added broadcast_77_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:13:53.843 INFO SparkContext - Created broadcast 77 from newAPIHadoopFile at PathSplitSource.java:96
20:13:53.866 INFO MemoryStore - Block broadcast_78 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
20:13:53.872 INFO MemoryStore - Block broadcast_78_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
20:13:53.873 INFO BlockManagerInfo - Added broadcast_78_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:13:53.873 INFO SparkContext - Created broadcast 78 from newAPIHadoopFile at PathSplitSource.java:96
20:13:53.895 INFO FileInputFormat - Total input files to process : 1
20:13:53.897 INFO MemoryStore - Block broadcast_79 stored as values in memory (estimated size 160.7 KiB, free 1917.5 MiB)
20:13:53.899 INFO MemoryStore - Block broadcast_79_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
20:13:53.899 INFO BlockManagerInfo - Added broadcast_79_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:13:53.899 INFO SparkContext - Created broadcast 79 from broadcast at ReadsSparkSink.java:133
20:13:53.900 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:13:53.900 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:13:53.901 INFO MemoryStore - Block broadcast_80 stored as values in memory (estimated size 163.2 KiB, free 1917.4 MiB)
20:13:53.902 INFO MemoryStore - Block broadcast_80_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
20:13:53.902 INFO BlockManagerInfo - Added broadcast_80_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:13:53.903 INFO SparkContext - Created broadcast 80 from broadcast at BamSink.java:76
20:13:53.905 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts dst=null perm=null proto=rpc
20:13:53.905 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:53.905 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:53.905 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:53.906 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:53.913 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:13:53.913 INFO DAGScheduler - Registering RDD 178 (mapToPair at SparkUtils.java:161) as input to shuffle 11
20:13:53.914 INFO DAGScheduler - Got job 36 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:13:53.914 INFO DAGScheduler - Final stage: ResultStage 50 (runJob at SparkHadoopWriter.scala:83)
20:13:53.914 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 49)
20:13:53.914 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 49)
20:13:53.914 INFO DAGScheduler - Submitting ShuffleMapStage 49 (MapPartitionsRDD[178] at mapToPair at SparkUtils.java:161), which has no missing parents
20:13:53.932 INFO MemoryStore - Block broadcast_81 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
20:13:53.934 INFO MemoryStore - Block broadcast_81_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
20:13:53.934 INFO BlockManagerInfo - Added broadcast_81_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.4 MiB)
20:13:53.934 INFO SparkContext - Created broadcast 81 from broadcast at DAGScheduler.scala:1580
20:13:53.934 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 49 (MapPartitionsRDD[178] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:13:53.934 INFO TaskSchedulerImpl - Adding task set 49.0 with 1 tasks resource profile 0
20:13:53.935 INFO TaskSetManager - Starting task 0.0 in stage 49.0 (TID 87) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:13:53.936 INFO Executor - Running task 0.0 in stage 49.0 (TID 87)
20:13:53.968 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:53.988 INFO Executor - Finished task 0.0 in stage 49.0 (TID 87). 1148 bytes result sent to driver
20:13:53.988 INFO TaskSetManager - Finished task 0.0 in stage 49.0 (TID 87) in 53 ms on localhost (executor driver) (1/1)
20:13:53.988 INFO TaskSchedulerImpl - Removed TaskSet 49.0, whose tasks have all completed, from pool
20:13:53.989 INFO DAGScheduler - ShuffleMapStage 49 (mapToPair at SparkUtils.java:161) finished in 0.075 s
20:13:53.989 INFO DAGScheduler - looking for newly runnable stages
20:13:53.989 INFO DAGScheduler - running: HashSet()
20:13:53.989 INFO DAGScheduler - waiting: HashSet(ResultStage 50)
20:13:53.989 INFO DAGScheduler - failed: HashSet()
20:13:53.989 INFO DAGScheduler - Submitting ResultStage 50 (MapPartitionsRDD[183] at mapToPair at BamSink.java:91), which has no missing parents
20:13:53.996 INFO MemoryStore - Block broadcast_82 stored as values in memory (estimated size 241.5 KiB, free 1916.4 MiB)
20:13:53.997 INFO MemoryStore - Block broadcast_82_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.4 MiB)
20:13:53.997 INFO BlockManagerInfo - Added broadcast_82_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.3 MiB)
20:13:53.997 INFO SparkContext - Created broadcast 82 from broadcast at DAGScheduler.scala:1580
20:13:53.997 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 50 (MapPartitionsRDD[183] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:13:53.997 INFO TaskSchedulerImpl - Adding task set 50.0 with 1 tasks resource profile 0
20:13:53.998 INFO TaskSetManager - Starting task 0.0 in stage 50.0 (TID 88) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:13:53.998 INFO Executor - Running task 0.0 in stage 50.0 (TID 88)
20:13:54.003 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:13:54.003 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:13:54.018 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:54.018 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:54.018 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:54.018 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:54.019 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:54.019 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:54.020 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/_temporary/attempt_202502102013536491088138182342854_0183_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:54.023 INFO StateChange - BLOCK* allocate blk_1073741849_1025, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/_temporary/attempt_202502102013536491088138182342854_0183_r_000000_0/part-r-00000
20:13:54.024 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741849_1025 src: /127.0.0.1:43776 dest: /127.0.0.1:38353
20:13:54.027 INFO clienttrace - src: /127.0.0.1:43776, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741849_1025, duration(ns): 2575097
20:13:54.027 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741849_1025, type=LAST_IN_PIPELINE terminating
20:13:54.028 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/_temporary/attempt_202502102013536491088138182342854_0183_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:54.029 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/_temporary/attempt_202502102013536491088138182342854_0183_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:13:54.030 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/_temporary/attempt_202502102013536491088138182342854_0183_r_000000_0 dst=null perm=null proto=rpc
20:13:54.030 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/_temporary/attempt_202502102013536491088138182342854_0183_r_000000_0 dst=null perm=null proto=rpc
20:13:54.031 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/task_202502102013536491088138182342854_0183_r_000000 dst=null perm=null proto=rpc
20:13:54.032 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/_temporary/attempt_202502102013536491088138182342854_0183_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/task_202502102013536491088138182342854_0183_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:54.032 INFO FileOutputCommitter - Saved output of task 'attempt_202502102013536491088138182342854_0183_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/task_202502102013536491088138182342854_0183_r_000000
20:13:54.032 INFO SparkHadoopMapRedUtil - attempt_202502102013536491088138182342854_0183_r_000000_0: Committed. Elapsed time: 1 ms.
20:13:54.033 INFO Executor - Finished task 0.0 in stage 50.0 (TID 88). 1858 bytes result sent to driver
20:13:54.034 INFO TaskSetManager - Finished task 0.0 in stage 50.0 (TID 88) in 36 ms on localhost (executor driver) (1/1)
20:13:54.034 INFO TaskSchedulerImpl - Removed TaskSet 50.0, whose tasks have all completed, from pool
20:13:54.034 INFO DAGScheduler - ResultStage 50 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
20:13:54.034 INFO DAGScheduler - Job 36 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:54.035 INFO TaskSchedulerImpl - Killing all running tasks in stage 50: Stage finished
20:13:54.035 INFO DAGScheduler - Job 36 finished: runJob at SparkHadoopWriter.scala:83, took 0.121871 s
20:13:54.036 INFO SparkHadoopWriter - Start to commit write Job job_202502102013536491088138182342854_0183.
20:13:54.036 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:13:54.037 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts dst=null perm=null proto=rpc
20:13:54.037 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/task_202502102013536491088138182342854_0183_r_000000 dst=null perm=null proto=rpc
20:13:54.038 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:13:54.039 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary/0/task_202502102013536491088138182342854_0183_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:54.039 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_temporary dst=null perm=null proto=rpc
20:13:54.040 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:54.041 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:54.042 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/.spark-staging-183 dst=null perm=null proto=rpc
20:13:54.042 INFO SparkHadoopWriter - Write Job job_202502102013536491088138182342854_0183 committed. Elapsed time: 6 ms.
20:13:54.042 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:54.044 INFO StateChange - BLOCK* allocate blk_1073741850_1026, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/header
20:13:54.045 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741850_1026 src: /127.0.0.1:43782 dest: /127.0.0.1:38353
20:13:54.046 INFO clienttrace - src: /127.0.0.1:43782, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741850_1026, duration(ns): 457329
20:13:54.046 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741850_1026, type=LAST_IN_PIPELINE terminating
20:13:54.047 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:54.048 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:54.049 INFO StateChange - BLOCK* allocate blk_1073741851_1027, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/terminator
20:13:54.050 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741851_1027 src: /127.0.0.1:43794 dest: /127.0.0.1:38353
20:13:54.051 INFO clienttrace - src: /127.0.0.1:43794, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741851_1027, duration(ns): 386524
20:13:54.051 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741851_1027, type=LAST_IN_PIPELINE terminating
20:13:54.051 INFO FSNamesystem - BLOCK* blk_1073741851_1027 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/terminator
20:13:54.452 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:54.453 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts dst=null perm=null proto=rpc
20:13:54.455 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:54.455 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:54.456 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam
20:13:54.456 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:54.457 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.458 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:54.458 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam done
20:13:54.458 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.459 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.parts dst=null perm=null proto=rpc
20:13:54.460 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.460 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.461 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.461 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.462 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.bai dst=null perm=null proto=rpc
20:13:54.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bai dst=null perm=null proto=rpc
20:13:54.464 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.465 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.sbi dst=null perm=null proto=rpc
20:13:54.467 INFO MemoryStore - Block broadcast_83 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
20:13:54.475 INFO BlockManagerInfo - Removed broadcast_75_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.5 MiB)
20:13:54.476 INFO BlockManagerInfo - Removed broadcast_76_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.5 MiB)
20:13:54.477 INFO BlockManagerInfo - Removed broadcast_78_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:13:54.478 INFO BlockManagerInfo - Removed broadcast_82_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.6 MiB)
20:13:54.479 INFO BlockManagerInfo - Removed broadcast_80_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.6 MiB)
20:13:54.480 INFO BlockManagerInfo - Removed broadcast_81_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.8 MiB)
20:13:54.481 INFO BlockManagerInfo - Removed broadcast_74_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.8 MiB)
20:13:54.482 INFO BlockManagerInfo - Removed broadcast_66_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:13:54.482 INFO MemoryStore - Block broadcast_83_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.8 MiB)
20:13:54.482 INFO BlockManagerInfo - Added broadcast_83_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:13:54.483 INFO SparkContext - Created broadcast 83 from newAPIHadoopFile at PathSplitSource.java:96
20:13:54.483 INFO BlockManagerInfo - Removed broadcast_73_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:13:54.484 INFO BlockManagerInfo - Removed broadcast_79_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.9 MiB)
20:13:54.484 INFO BlockManagerInfo - Removed broadcast_72_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.9 MiB)
20:13:54.510 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.510 INFO FileInputFormat - Total input files to process : 1
20:13:54.511 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.560 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:13:54.560 INFO DAGScheduler - Got job 37 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:13:54.560 INFO DAGScheduler - Final stage: ResultStage 51 (collect at ReadsSparkSinkUnitTest.java:182)
20:13:54.560 INFO DAGScheduler - Parents of final stage: List()
20:13:54.560 INFO DAGScheduler - Missing parents: List()
20:13:54.560 INFO DAGScheduler - Submitting ResultStage 51 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:54.577 INFO MemoryStore - Block broadcast_84 stored as values in memory (estimated size 426.2 KiB, free 1918.9 MiB)
20:13:54.579 INFO MemoryStore - Block broadcast_84_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1918.8 MiB)
20:13:54.579 INFO BlockManagerInfo - Added broadcast_84_piece0 in memory on localhost:35739 (size: 153.7 KiB, free: 1919.8 MiB)
20:13:54.579 INFO SparkContext - Created broadcast 84 from broadcast at DAGScheduler.scala:1580
20:13:54.579 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 51 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:54.580 INFO TaskSchedulerImpl - Adding task set 51.0 with 1 tasks resource profile 0
20:13:54.580 INFO TaskSetManager - Starting task 0.0 in stage 51.0 (TID 89) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:54.581 INFO Executor - Running task 0.0 in stage 51.0 (TID 89)
20:13:54.613 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam:0+237038
20:13:54.614 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.615 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.617 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.619 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.bai dst=null perm=null proto=rpc
20:13:54.620 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bai dst=null perm=null proto=rpc
20:13:54.622 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.623 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.624 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.625 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.625 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.632 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.633 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.634 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.634 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.635 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.638 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.639 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.640 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.640 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.642 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.643 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.644 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.645 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.646 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.646 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.647 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.648 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.648 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.649 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.652 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.652 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.654 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.655 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.656 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.656 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.657 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.659 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.660 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.661 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.662 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.662 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.665 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.666 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.666 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.667 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.668 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.669 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.669 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.670 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.671 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.671 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.673 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.673 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.674 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.675 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.675 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.677 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.678 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.679 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.679 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.680 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.681 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.681 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.683 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.684 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.684 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.684 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.685 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.686 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.bai dst=null perm=null proto=rpc
20:13:54.687 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bai dst=null perm=null proto=rpc
20:13:54.688 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.692 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:54.694 INFO Executor - Finished task 0.0 in stage 51.0 (TID 89). 651526 bytes result sent to driver
20:13:54.696 INFO TaskSetManager - Finished task 0.0 in stage 51.0 (TID 89) in 116 ms on localhost (executor driver) (1/1)
20:13:54.696 INFO TaskSchedulerImpl - Removed TaskSet 51.0, whose tasks have all completed, from pool
20:13:54.697 INFO DAGScheduler - ResultStage 51 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.136 s
20:13:54.697 INFO DAGScheduler - Job 37 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:54.697 INFO TaskSchedulerImpl - Killing all running tasks in stage 51: Stage finished
20:13:54.697 INFO DAGScheduler - Job 37 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.137195 s
20:13:54.713 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:54.713 INFO DAGScheduler - Got job 38 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:54.714 INFO DAGScheduler - Final stage: ResultStage 52 (count at ReadsSparkSinkUnitTest.java:185)
20:13:54.714 INFO DAGScheduler - Parents of final stage: List()
20:13:54.714 INFO DAGScheduler - Missing parents: List()
20:13:54.714 INFO DAGScheduler - Submitting ResultStage 52 (MapPartitionsRDD[171] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:54.737 INFO MemoryStore - Block broadcast_85 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
20:13:54.738 INFO MemoryStore - Block broadcast_85_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
20:13:54.739 INFO BlockManagerInfo - Added broadcast_85_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.6 MiB)
20:13:54.739 INFO SparkContext - Created broadcast 85 from broadcast at DAGScheduler.scala:1580
20:13:54.739 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 52 (MapPartitionsRDD[171] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:54.739 INFO TaskSchedulerImpl - Adding task set 52.0 with 1 tasks resource profile 0
20:13:54.740 INFO TaskSetManager - Starting task 0.0 in stage 52.0 (TID 90) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:13:54.740 INFO Executor - Running task 0.0 in stage 52.0 (TID 90)
20:13:54.776 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:13:54.787 INFO Executor - Finished task 0.0 in stage 52.0 (TID 90). 989 bytes result sent to driver
20:13:54.788 INFO TaskSetManager - Finished task 0.0 in stage 52.0 (TID 90) in 48 ms on localhost (executor driver) (1/1)
20:13:54.788 INFO TaskSchedulerImpl - Removed TaskSet 52.0, whose tasks have all completed, from pool
20:13:54.788 INFO DAGScheduler - ResultStage 52 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.074 s
20:13:54.788 INFO DAGScheduler - Job 38 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:54.788 INFO TaskSchedulerImpl - Killing all running tasks in stage 52: Stage finished
20:13:54.788 INFO DAGScheduler - Job 38 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.075297 s
20:13:54.792 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:54.792 INFO DAGScheduler - Got job 39 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:54.792 INFO DAGScheduler - Final stage: ResultStage 53 (count at ReadsSparkSinkUnitTest.java:185)
20:13:54.792 INFO DAGScheduler - Parents of final stage: List()
20:13:54.792 INFO DAGScheduler - Missing parents: List()
20:13:54.793 INFO DAGScheduler - Submitting ResultStage 53 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:54.809 INFO MemoryStore - Block broadcast_86 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:13:54.811 INFO MemoryStore - Block broadcast_86_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:13:54.811 INFO BlockManagerInfo - Added broadcast_86_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:13:54.811 INFO SparkContext - Created broadcast 86 from broadcast at DAGScheduler.scala:1580
20:13:54.812 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 53 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:54.812 INFO TaskSchedulerImpl - Adding task set 53.0 with 1 tasks resource profile 0
20:13:54.812 INFO TaskSetManager - Starting task 0.0 in stage 53.0 (TID 91) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:54.813 INFO Executor - Running task 0.0 in stage 53.0 (TID 91)
20:13:54.843 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam:0+237038
20:13:54.843 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.844 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.846 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.848 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.bai dst=null perm=null proto=rpc
20:13:54.848 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bai dst=null perm=null proto=rpc
20:13:54.850 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.851 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.852 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.853 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.853 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.858 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.859 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.860 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.860 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.861 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.863 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.864 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.864 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.865 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.866 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.867 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.867 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.868 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.869 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.870 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.870 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.872 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.873 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.873 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.876 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.876 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.878 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.879 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.880 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.880 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.881 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.882 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.883 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.883 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.884 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.885 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.886 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.886 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.887 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.888 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.889 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.889 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.890 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.891 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.892 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.892 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.893 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.895 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.896 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.897 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.898 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.898 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.900 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.901 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.902 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.903 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.904 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.904 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.905 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.906 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.907 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.907 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.908 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.909 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.909 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:13:54.910 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.910 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.911 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.911 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam dst=null perm=null proto=rpc
20:13:54.913 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bam.bai dst=null perm=null proto=rpc
20:13:54.913 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_2bc56dee-25b4-44ff-bc3d-f666c7ae6abd.bai dst=null perm=null proto=rpc
20:13:54.915 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:54.917 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:13:54.918 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:54.920 INFO Executor - Finished task 0.0 in stage 53.0 (TID 91). 989 bytes result sent to driver
20:13:54.921 INFO TaskSetManager - Finished task 0.0 in stage 53.0 (TID 91) in 108 ms on localhost (executor driver) (1/1)
20:13:54.921 INFO TaskSchedulerImpl - Removed TaskSet 53.0, whose tasks have all completed, from pool
20:13:54.921 INFO DAGScheduler - ResultStage 53 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.128 s
20:13:54.921 INFO DAGScheduler - Job 39 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:54.921 INFO TaskSchedulerImpl - Killing all running tasks in stage 53: Stage finished
20:13:54.921 INFO DAGScheduler - Job 39 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.128913 s
20:13:54.926 INFO MemoryStore - Block broadcast_87 stored as values in memory (estimated size 298.0 KiB, free 1917.3 MiB)
20:13:54.934 INFO MemoryStore - Block broadcast_87_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.3 MiB)
20:13:54.934 INFO BlockManagerInfo - Added broadcast_87_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.4 MiB)
20:13:54.934 INFO SparkContext - Created broadcast 87 from newAPIHadoopFile at PathSplitSource.java:96
20:13:54.961 INFO MemoryStore - Block broadcast_88 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
20:13:54.967 INFO MemoryStore - Block broadcast_88_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.9 MiB)
20:13:54.967 INFO BlockManagerInfo - Added broadcast_88_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.4 MiB)
20:13:54.968 INFO SparkContext - Created broadcast 88 from newAPIHadoopFile at PathSplitSource.java:96
20:13:54.989 INFO FileInputFormat - Total input files to process : 1
20:13:54.991 INFO MemoryStore - Block broadcast_89 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
20:13:54.993 INFO MemoryStore - Block broadcast_89_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
20:13:54.993 INFO BlockManagerInfo - Added broadcast_89_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:13:54.993 INFO SparkContext - Created broadcast 89 from broadcast at ReadsSparkSink.java:133
20:13:54.994 INFO MemoryStore - Block broadcast_90 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
20:13:54.996 INFO MemoryStore - Block broadcast_90_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
20:13:54.996 INFO BlockManagerInfo - Added broadcast_90_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:13:54.996 INFO SparkContext - Created broadcast 90 from broadcast at BamSink.java:76
20:13:54.998 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts dst=null perm=null proto=rpc
20:13:54.999 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:54.999 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:54.999 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:55.000 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:55.006 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:13:55.007 INFO DAGScheduler - Registering RDD 204 (mapToPair at SparkUtils.java:161) as input to shuffle 12
20:13:55.007 INFO DAGScheduler - Got job 40 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:13:55.007 INFO DAGScheduler - Final stage: ResultStage 55 (runJob at SparkHadoopWriter.scala:83)
20:13:55.007 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 54)
20:13:55.007 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 54)
20:13:55.007 INFO DAGScheduler - Submitting ShuffleMapStage 54 (MapPartitionsRDD[204] at mapToPair at SparkUtils.java:161), which has no missing parents
20:13:55.025 INFO MemoryStore - Block broadcast_91 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
20:13:55.026 INFO MemoryStore - Block broadcast_91_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
20:13:55.026 INFO BlockManagerInfo - Added broadcast_91_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.2 MiB)
20:13:55.027 INFO SparkContext - Created broadcast 91 from broadcast at DAGScheduler.scala:1580
20:13:55.027 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 54 (MapPartitionsRDD[204] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:13:55.027 INFO TaskSchedulerImpl - Adding task set 54.0 with 1 tasks resource profile 0
20:13:55.028 INFO TaskSetManager - Starting task 0.0 in stage 54.0 (TID 92) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
20:13:55.028 INFO Executor - Running task 0.0 in stage 54.0 (TID 92)
20:13:55.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741840_1016 replica FinalizedReplica, blk_1073741840_1016, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741840 for deletion
20:13:55.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741845_1021 replica FinalizedReplica, blk_1073741845_1021, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741845 for deletion
20:13:55.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741845_1021 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741845
20:13:55.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741840_1016 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741840
20:13:55.059 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:13:55.079 INFO Executor - Finished task 0.0 in stage 54.0 (TID 92). 1148 bytes result sent to driver
20:13:55.079 INFO TaskSetManager - Finished task 0.0 in stage 54.0 (TID 92) in 52 ms on localhost (executor driver) (1/1)
20:13:55.079 INFO TaskSchedulerImpl - Removed TaskSet 54.0, whose tasks have all completed, from pool
20:13:55.080 INFO DAGScheduler - ShuffleMapStage 54 (mapToPair at SparkUtils.java:161) finished in 0.072 s
20:13:55.080 INFO DAGScheduler - looking for newly runnable stages
20:13:55.080 INFO DAGScheduler - running: HashSet()
20:13:55.080 INFO DAGScheduler - waiting: HashSet(ResultStage 55)
20:13:55.080 INFO DAGScheduler - failed: HashSet()
20:13:55.080 INFO DAGScheduler - Submitting ResultStage 55 (MapPartitionsRDD[209] at mapToPair at BamSink.java:91), which has no missing parents
20:13:55.092 INFO MemoryStore - Block broadcast_92 stored as values in memory (estimated size 241.5 KiB, free 1915.7 MiB)
20:13:55.100 INFO MemoryStore - Block broadcast_92_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.0 MiB)
20:13:55.100 INFO BlockManagerInfo - Removed broadcast_88_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.2 MiB)
20:13:55.100 INFO BlockManagerInfo - Added broadcast_92_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.2 MiB)
20:13:55.101 INFO SparkContext - Created broadcast 92 from broadcast at DAGScheduler.scala:1580
20:13:55.101 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 55 (MapPartitionsRDD[209] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:13:55.101 INFO TaskSchedulerImpl - Adding task set 55.0 with 1 tasks resource profile 0
20:13:55.101 INFO BlockManagerInfo - Removed broadcast_83_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.2 MiB)
20:13:55.102 INFO TaskSetManager - Starting task 0.0 in stage 55.0 (TID 93) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:13:55.102 INFO BlockManagerInfo - Removed broadcast_84_piece0 on localhost:35739 in memory (size: 153.7 KiB, free: 1919.4 MiB)
20:13:55.102 INFO Executor - Running task 0.0 in stage 55.0 (TID 93)
20:13:55.103 INFO BlockManagerInfo - Removed broadcast_86_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.5 MiB)
20:13:55.104 INFO BlockManagerInfo - Removed broadcast_77_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:13:55.104 INFO BlockManagerInfo - Removed broadcast_85_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:13:55.111 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:13:55.111 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:13:55.134 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:55.135 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:55.135 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:55.135 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:55.135 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:55.135 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:55.136 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:55.137 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:55.139 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:55.143 INFO StateChange - BLOCK* allocate blk_1073741852_1028, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/part-r-00000
20:13:55.144 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741852_1028 src: /127.0.0.1:44504 dest: /127.0.0.1:38353
20:13:55.148 INFO clienttrace - src: /127.0.0.1:44504, dest: /127.0.0.1:38353, bytes: 229774, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741852_1028, duration(ns): 3520562
20:13:55.148 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741852_1028, type=LAST_IN_PIPELINE terminating
20:13:55.149 INFO FSNamesystem - BLOCK* blk_1073741852_1028 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/part-r-00000
20:13:55.551 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:55.552 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:13:55.553 INFO StateChange - BLOCK* allocate blk_1073741853_1029, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/.part-r-00000.sbi
20:13:55.553 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741853_1029 src: /127.0.0.1:44512 dest: /127.0.0.1:38353
20:13:55.555 INFO clienttrace - src: /127.0.0.1:44512, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741853_1029, duration(ns): 394098
20:13:55.555 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741853_1029, type=LAST_IN_PIPELINE terminating
20:13:55.555 INFO FSNamesystem - BLOCK* blk_1073741853_1029 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/.part-r-00000.sbi
20:13:55.956 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:55.958 INFO StateChange - BLOCK* allocate blk_1073741854_1030, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/.part-r-00000.bai
20:13:55.959 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741854_1030 src: /127.0.0.1:44522 dest: /127.0.0.1:38353
20:13:55.960 INFO clienttrace - src: /127.0.0.1:44522, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741854_1030, duration(ns): 487136
20:13:55.961 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741854_1030, type=LAST_IN_PIPELINE terminating
20:13:55.961 INFO FSNamesystem - BLOCK* blk_1073741854_1030 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/.part-r-00000.bai
20:13:56.362 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:56.363 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0 dst=null perm=null proto=rpc
20:13:56.364 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0 dst=null perm=null proto=rpc
20:13:56.365 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/task_202502102013548336384537597567776_0209_r_000000 dst=null perm=null proto=rpc
20:13:56.366 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/_temporary/attempt_202502102013548336384537597567776_0209_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/task_202502102013548336384537597567776_0209_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:56.366 INFO FileOutputCommitter - Saved output of task 'attempt_202502102013548336384537597567776_0209_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/task_202502102013548336384537597567776_0209_r_000000
20:13:56.366 INFO SparkHadoopMapRedUtil - attempt_202502102013548336384537597567776_0209_r_000000_0: Committed. Elapsed time: 1 ms.
20:13:56.367 INFO Executor - Finished task 0.0 in stage 55.0 (TID 93). 1858 bytes result sent to driver
20:13:56.367 INFO TaskSetManager - Finished task 0.0 in stage 55.0 (TID 93) in 1265 ms on localhost (executor driver) (1/1)
20:13:56.367 INFO TaskSchedulerImpl - Removed TaskSet 55.0, whose tasks have all completed, from pool
20:13:56.368 INFO DAGScheduler - ResultStage 55 (runJob at SparkHadoopWriter.scala:83) finished in 1.287 s
20:13:56.368 INFO DAGScheduler - Job 40 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:56.368 INFO TaskSchedulerImpl - Killing all running tasks in stage 55: Stage finished
20:13:56.368 INFO DAGScheduler - Job 40 finished: runJob at SparkHadoopWriter.scala:83, took 1.361979 s
20:13:56.369 INFO SparkHadoopWriter - Start to commit write Job job_202502102013548336384537597567776_0209.
20:13:56.369 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:13:56.370 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts dst=null perm=null proto=rpc
20:13:56.370 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/task_202502102013548336384537597567776_0209_r_000000 dst=null perm=null proto=rpc
20:13:56.371 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:56.372 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/task_202502102013548336384537597567776_0209_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:56.372 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:56.373 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/task_202502102013548336384537597567776_0209_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:56.373 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:13:56.374 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary/0/task_202502102013548336384537597567776_0209_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:56.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_temporary dst=null perm=null proto=rpc
20:13:56.376 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:56.377 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:56.378 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.spark-staging-209 dst=null perm=null proto=rpc
20:13:56.378 INFO SparkHadoopWriter - Write Job job_202502102013548336384537597567776_0209 committed. Elapsed time: 9 ms.
20:13:56.379 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:56.381 INFO StateChange - BLOCK* allocate blk_1073741855_1031, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/header
20:13:56.383 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741855_1031 src: /127.0.0.1:44536 dest: /127.0.0.1:38353
20:13:56.384 INFO clienttrace - src: /127.0.0.1:44536, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741855_1031, duration(ns): 435886
20:13:56.384 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741855_1031, type=LAST_IN_PIPELINE terminating
20:13:56.385 INFO FSNamesystem - BLOCK* blk_1073741855_1031 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/header
20:13:56.786 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:56.788 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:56.789 INFO StateChange - BLOCK* allocate blk_1073741856_1032, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/terminator
20:13:56.790 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741856_1032 src: /127.0.0.1:44550 dest: /127.0.0.1:38353
20:13:56.791 INFO clienttrace - src: /127.0.0.1:44550, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741856_1032, duration(ns): 429772
20:13:56.791 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741856_1032, type=LAST_IN_PIPELINE terminating
20:13:56.792 INFO FSNamesystem - BLOCK* blk_1073741856_1032 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/terminator
20:13:57.193 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:57.194 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts dst=null perm=null proto=rpc
20:13:57.195 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:57.196 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:57.196 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam
20:13:57.196 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:57.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:57.198 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam done
20:13:57.198 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.198 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi
20:13:57.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts dst=null perm=null proto=rpc
20:13:57.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:57.200 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:57.201 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:57.202 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:13:57.203 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:57.203 INFO StateChange - BLOCK* allocate blk_1073741857_1033, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi
20:13:57.204 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741857_1033 src: /127.0.0.1:44560 dest: /127.0.0.1:38353
20:13:57.205 INFO clienttrace - src: /127.0.0.1:44560, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741857_1033, duration(ns): 410937
20:13:57.205 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741857_1033, type=LAST_IN_PIPELINE terminating
20:13:57.206 INFO FSNamesystem - BLOCK* blk_1073741857_1033 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi
20:13:57.607 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:57.608 INFO IndexFileMerger - Done merging .sbi files
20:13:57.608 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai
20:13:57.608 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts dst=null perm=null proto=rpc
20:13:57.609 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:57.610 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:57.610 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:57.612 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:57.612 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:57.614 INFO StateChange - BLOCK* allocate blk_1073741858_1034, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai
20:13:57.615 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741858_1034 src: /127.0.0.1:37706 dest: /127.0.0.1:38353
20:13:57.616 INFO clienttrace - src: /127.0.0.1:37706, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741858_1034, duration(ns): 576059
20:13:57.616 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741858_1034, type=LAST_IN_PIPELINE terminating
20:13:57.617 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:57.618 INFO IndexFileMerger - Done merging .bai files
20:13:57.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.parts dst=null perm=null proto=rpc
20:13:57.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.639 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi dst=null perm=null proto=rpc
20:13:57.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi dst=null perm=null proto=rpc
20:13:57.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi dst=null perm=null proto=rpc
20:13:57.641 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:13:57.642 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.642 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.643 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.643 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.644 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.645 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.645 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.646 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:57.648 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:57.649 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi dst=null perm=null proto=rpc
20:13:57.650 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi dst=null perm=null proto=rpc
20:13:57.650 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.sbi dst=null perm=null proto=rpc
20:13:57.651 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:13:57.652 INFO MemoryStore - Block broadcast_93 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
20:13:57.652 INFO MemoryStore - Block broadcast_93_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
20:13:57.653 INFO BlockManagerInfo - Added broadcast_93_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.7 MiB)
20:13:57.653 INFO SparkContext - Created broadcast 93 from broadcast at BamSource.java:104
20:13:57.654 INFO MemoryStore - Block broadcast_94 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:13:57.665 INFO MemoryStore - Block broadcast_94_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:13:57.665 INFO BlockManagerInfo - Added broadcast_94_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:13:57.665 INFO SparkContext - Created broadcast 94 from newAPIHadoopFile at PathSplitSource.java:96
20:13:57.676 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.676 INFO FileInputFormat - Total input files to process : 1
20:13:57.677 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.691 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:13:57.692 INFO DAGScheduler - Got job 41 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:13:57.692 INFO DAGScheduler - Final stage: ResultStage 56 (collect at ReadsSparkSinkUnitTest.java:182)
20:13:57.692 INFO DAGScheduler - Parents of final stage: List()
20:13:57.692 INFO DAGScheduler - Missing parents: List()
20:13:57.692 INFO DAGScheduler - Submitting ResultStage 56 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:57.699 INFO MemoryStore - Block broadcast_95 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
20:13:57.700 INFO MemoryStore - Block broadcast_95_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
20:13:57.700 INFO BlockManagerInfo - Added broadcast_95_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:13:57.700 INFO SparkContext - Created broadcast 95 from broadcast at DAGScheduler.scala:1580
20:13:57.700 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 56 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:57.700 INFO TaskSchedulerImpl - Adding task set 56.0 with 1 tasks resource profile 0
20:13:57.701 INFO TaskSetManager - Starting task 0.0 in stage 56.0 (TID 94) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:57.701 INFO Executor - Running task 0.0 in stage 56.0 (TID 94)
20:13:57.714 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam:0+235514
20:13:57.715 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.717 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.717 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.718 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.720 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:57.722 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:57.722 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:57.724 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
20:13:57.725 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:57.729 INFO Executor - Finished task 0.0 in stage 56.0 (TID 94). 650184 bytes result sent to driver
20:13:57.732 INFO TaskSetManager - Finished task 0.0 in stage 56.0 (TID 94) in 31 ms on localhost (executor driver) (1/1)
20:13:57.732 INFO TaskSchedulerImpl - Removed TaskSet 56.0, whose tasks have all completed, from pool
20:13:57.732 INFO DAGScheduler - ResultStage 56 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.039 s
20:13:57.733 INFO DAGScheduler - Job 41 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:57.733 INFO TaskSchedulerImpl - Killing all running tasks in stage 56: Stage finished
20:13:57.733 INFO DAGScheduler - Job 41 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.041139 s
20:13:57.748 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:57.748 INFO DAGScheduler - Got job 42 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:57.748 INFO DAGScheduler - Final stage: ResultStage 57 (count at ReadsSparkSinkUnitTest.java:185)
20:13:57.748 INFO DAGScheduler - Parents of final stage: List()
20:13:57.748 INFO DAGScheduler - Missing parents: List()
20:13:57.749 INFO DAGScheduler - Submitting ResultStage 57 (MapPartitionsRDD[197] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:57.772 INFO MemoryStore - Block broadcast_96 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
20:13:57.773 INFO MemoryStore - Block broadcast_96_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
20:13:57.774 INFO BlockManagerInfo - Added broadcast_96_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:13:57.774 INFO SparkContext - Created broadcast 96 from broadcast at DAGScheduler.scala:1580
20:13:57.774 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 57 (MapPartitionsRDD[197] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:57.774 INFO TaskSchedulerImpl - Adding task set 57.0 with 1 tasks resource profile 0
20:13:57.775 INFO TaskSetManager - Starting task 0.0 in stage 57.0 (TID 95) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
20:13:57.775 INFO Executor - Running task 0.0 in stage 57.0 (TID 95)
20:13:57.810 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:13:57.822 INFO Executor - Finished task 0.0 in stage 57.0 (TID 95). 989 bytes result sent to driver
20:13:57.823 INFO TaskSetManager - Finished task 0.0 in stage 57.0 (TID 95) in 48 ms on localhost (executor driver) (1/1)
20:13:57.823 INFO TaskSchedulerImpl - Removed TaskSet 57.0, whose tasks have all completed, from pool
20:13:57.823 INFO DAGScheduler - ResultStage 57 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.074 s
20:13:57.823 INFO DAGScheduler - Job 42 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:57.823 INFO TaskSchedulerImpl - Killing all running tasks in stage 57: Stage finished
20:13:57.823 INFO DAGScheduler - Job 42 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.075293 s
20:13:57.827 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:13:57.827 INFO DAGScheduler - Got job 43 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:13:57.827 INFO DAGScheduler - Final stage: ResultStage 58 (count at ReadsSparkSinkUnitTest.java:185)
20:13:57.827 INFO DAGScheduler - Parents of final stage: List()
20:13:57.827 INFO DAGScheduler - Missing parents: List()
20:13:57.827 INFO DAGScheduler - Submitting ResultStage 58 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96), which has no missing parents
20:13:57.839 INFO MemoryStore - Block broadcast_97 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
20:13:57.840 INFO MemoryStore - Block broadcast_97_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
20:13:57.840 INFO BlockManagerInfo - Added broadcast_97_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.4 MiB)
20:13:57.840 INFO SparkContext - Created broadcast 97 from broadcast at DAGScheduler.scala:1580
20:13:57.840 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 58 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:13:57.840 INFO TaskSchedulerImpl - Adding task set 58.0 with 1 tasks resource profile 0
20:13:57.841 INFO TaskSetManager - Starting task 0.0 in stage 58.0 (TID 96) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:13:57.841 INFO Executor - Running task 0.0 in stage 58.0 (TID 96)
20:13:57.853 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam:0+235514
20:13:57.854 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.854 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam dst=null perm=null proto=rpc
20:13:57.855 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.856 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.856 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_9e5c17cd-a7b1-413b-ad6f-0c31bd76475a.bam.bai dst=null perm=null proto=rpc
20:13:57.858 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:13:57.860 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:57.861 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:13:57.862 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
20:13:57.863 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:13:57.864 INFO Executor - Finished task 0.0 in stage 58.0 (TID 96). 989 bytes result sent to driver
20:13:57.865 INFO TaskSetManager - Finished task 0.0 in stage 58.0 (TID 96) in 24 ms on localhost (executor driver) (1/1)
20:13:57.865 INFO TaskSchedulerImpl - Removed TaskSet 58.0, whose tasks have all completed, from pool
20:13:57.865 INFO DAGScheduler - ResultStage 58 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.037 s
20:13:57.865 INFO DAGScheduler - Job 43 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:57.865 INFO TaskSchedulerImpl - Killing all running tasks in stage 58: Stage finished
20:13:57.865 INFO DAGScheduler - Job 43 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.038702 s
20:13:57.869 INFO MemoryStore - Block broadcast_98 stored as values in memory (estimated size 298.0 KiB, free 1916.8 MiB)
20:13:57.875 INFO MemoryStore - Block broadcast_98_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
20:13:57.875 INFO BlockManagerInfo - Added broadcast_98_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:13:57.876 INFO SparkContext - Created broadcast 98 from newAPIHadoopFile at PathSplitSource.java:96
20:13:57.897 INFO MemoryStore - Block broadcast_99 stored as values in memory (estimated size 298.0 KiB, free 1916.4 MiB)
20:13:57.904 INFO MemoryStore - Block broadcast_99_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:13:57.904 INFO BlockManagerInfo - Added broadcast_99_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:13:57.904 INFO SparkContext - Created broadcast 99 from newAPIHadoopFile at PathSplitSource.java:96
20:13:57.925 INFO FileInputFormat - Total input files to process : 1
20:13:57.926 INFO MemoryStore - Block broadcast_100 stored as values in memory (estimated size 19.6 KiB, free 1916.4 MiB)
20:13:57.927 INFO MemoryStore - Block broadcast_100_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
20:13:57.927 INFO BlockManagerInfo - Added broadcast_100_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.3 MiB)
20:13:57.927 INFO SparkContext - Created broadcast 100 from broadcast at ReadsSparkSink.java:133
20:13:57.928 INFO MemoryStore - Block broadcast_101 stored as values in memory (estimated size 20.0 KiB, free 1916.3 MiB)
20:13:57.934 INFO MemoryStore - Block broadcast_101_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.9 MiB)
20:13:57.935 INFO BlockManagerInfo - Removed broadcast_96_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.4 MiB)
20:13:57.935 INFO BlockManagerInfo - Added broadcast_101_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.4 MiB)
20:13:57.935 INFO SparkContext - Created broadcast 101 from broadcast at BamSink.java:76
20:13:57.935 INFO BlockManagerInfo - Removed broadcast_99_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:13:57.936 INFO BlockManagerInfo - Removed broadcast_87_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.5 MiB)
20:13:57.936 INFO BlockManagerInfo - Removed broadcast_92_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.6 MiB)
20:13:57.937 INFO BlockManagerInfo - Removed broadcast_94_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:13:57.938 INFO BlockManagerInfo - Removed broadcast_95_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.7 MiB)
20:13:57.939 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts dst=null perm=null proto=rpc
20:13:57.939 INFO BlockManagerInfo - Removed broadcast_89_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:13:57.939 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:57.939 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:57.939 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:57.939 INFO BlockManagerInfo - Removed broadcast_93_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.7 MiB)
20:13:57.940 INFO BlockManagerInfo - Removed broadcast_90_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:13:57.940 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:57.940 INFO BlockManagerInfo - Removed broadcast_91_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.9 MiB)
20:13:57.941 INFO BlockManagerInfo - Removed broadcast_97_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.9 MiB)
20:13:57.947 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:13:57.947 INFO DAGScheduler - Registering RDD 229 (mapToPair at SparkUtils.java:161) as input to shuffle 13
20:13:57.948 INFO DAGScheduler - Got job 44 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:13:57.948 INFO DAGScheduler - Final stage: ResultStage 60 (runJob at SparkHadoopWriter.scala:83)
20:13:57.948 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 59)
20:13:57.948 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 59)
20:13:57.948 INFO DAGScheduler - Submitting ShuffleMapStage 59 (MapPartitionsRDD[229] at mapToPair at SparkUtils.java:161), which has no missing parents
20:13:57.971 INFO MemoryStore - Block broadcast_102 stored as values in memory (estimated size 434.3 KiB, free 1919.2 MiB)
20:13:57.972 INFO MemoryStore - Block broadcast_102_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1919.0 MiB)
20:13:57.972 INFO BlockManagerInfo - Added broadcast_102_piece0 in memory on localhost:35739 (size: 157.6 KiB, free: 1919.8 MiB)
20:13:57.973 INFO SparkContext - Created broadcast 102 from broadcast at DAGScheduler.scala:1580
20:13:57.973 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 59 (MapPartitionsRDD[229] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:13:57.973 INFO TaskSchedulerImpl - Adding task set 59.0 with 1 tasks resource profile 0
20:13:57.974 INFO TaskSetManager - Starting task 0.0 in stage 59.0 (TID 97) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
20:13:57.974 INFO Executor - Running task 0.0 in stage 59.0 (TID 97)
20:13:58.004 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:13:58.018 INFO Executor - Finished task 0.0 in stage 59.0 (TID 97). 1148 bytes result sent to driver
20:13:58.019 INFO TaskSetManager - Finished task 0.0 in stage 59.0 (TID 97) in 46 ms on localhost (executor driver) (1/1)
20:13:58.019 INFO TaskSchedulerImpl - Removed TaskSet 59.0, whose tasks have all completed, from pool
20:13:58.019 INFO DAGScheduler - ShuffleMapStage 59 (mapToPair at SparkUtils.java:161) finished in 0.071 s
20:13:58.019 INFO DAGScheduler - looking for newly runnable stages
20:13:58.019 INFO DAGScheduler - running: HashSet()
20:13:58.019 INFO DAGScheduler - waiting: HashSet(ResultStage 60)
20:13:58.019 INFO DAGScheduler - failed: HashSet()
20:13:58.020 INFO DAGScheduler - Submitting ResultStage 60 (MapPartitionsRDD[234] at mapToPair at BamSink.java:91), which has no missing parents
20:13:58.030 INFO MemoryStore - Block broadcast_103 stored as values in memory (estimated size 155.4 KiB, free 1918.9 MiB)
20:13:58.031 INFO MemoryStore - Block broadcast_103_piece0 stored as bytes in memory (estimated size 58.6 KiB, free 1918.8 MiB)
20:13:58.031 INFO BlockManagerInfo - Added broadcast_103_piece0 in memory on localhost:35739 (size: 58.6 KiB, free: 1919.7 MiB)
20:13:58.032 INFO SparkContext - Created broadcast 103 from broadcast at DAGScheduler.scala:1580
20:13:58.032 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 60 (MapPartitionsRDD[234] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:13:58.032 INFO TaskSchedulerImpl - Adding task set 60.0 with 1 tasks resource profile 0
20:13:58.033 INFO TaskSetManager - Starting task 0.0 in stage 60.0 (TID 98) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:13:58.033 INFO Executor - Running task 0.0 in stage 60.0 (TID 98)
20:13:58.038 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:13:58.038 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:13:58.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741853_1029 replica FinalizedReplica, blk_1073741853_1029, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741853 for deletion
20:13:58.049 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741853_1029 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741853
20:13:58.052 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:58.052 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:58.052 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:58.053 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:13:58.053 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:13:58.053 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:13:58.054 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:58.055 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:58.056 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:58.059 INFO StateChange - BLOCK* allocate blk_1073741859_1035, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/part-r-00000
20:13:58.060 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741859_1035 src: /127.0.0.1:37742 dest: /127.0.0.1:38353
20:13:58.063 INFO clienttrace - src: /127.0.0.1:37742, dest: /127.0.0.1:38353, bytes: 235299, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741859_1035, duration(ns): 2683448
20:13:58.063 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741859_1035, type=LAST_IN_PIPELINE terminating
20:13:58.064 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:58.065 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:13:58.065 INFO StateChange - BLOCK* allocate blk_1073741860_1036, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/.part-r-00000.sbi
20:13:58.066 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741860_1036 src: /127.0.0.1:37744 dest: /127.0.0.1:38353
20:13:58.067 INFO clienttrace - src: /127.0.0.1:37744, dest: /127.0.0.1:38353, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741860_1036, duration(ns): 366777
20:13:58.067 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741860_1036, type=LAST_IN_PIPELINE terminating
20:13:58.068 INFO FSNamesystem - BLOCK* blk_1073741860_1036 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/.part-r-00000.sbi
20:13:58.469 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:58.470 INFO StateChange - BLOCK* allocate blk_1073741861_1037, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/.part-r-00000.bai
20:13:58.471 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741861_1037 src: /127.0.0.1:37752 dest: /127.0.0.1:38353
20:13:58.472 INFO clienttrace - src: /127.0.0.1:37752, dest: /127.0.0.1:38353, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741861_1037, duration(ns): 404390
20:13:58.472 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741861_1037, type=LAST_IN_PIPELINE terminating
20:13:58.472 INFO FSNamesystem - BLOCK* blk_1073741861_1037 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/.part-r-00000.bai
20:13:58.873 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:58.874 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0 dst=null perm=null proto=rpc
20:13:58.875 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0 dst=null perm=null proto=rpc
20:13:58.876 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/task_20250210201357203934775909965574_0234_r_000000 dst=null perm=null proto=rpc
20:13:58.877 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/_temporary/attempt_20250210201357203934775909965574_0234_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/task_20250210201357203934775909965574_0234_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:13:58.877 INFO FileOutputCommitter - Saved output of task 'attempt_20250210201357203934775909965574_0234_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/task_20250210201357203934775909965574_0234_r_000000
20:13:58.877 INFO SparkHadoopMapRedUtil - attempt_20250210201357203934775909965574_0234_r_000000_0: Committed. Elapsed time: 2 ms.
20:13:58.878 INFO Executor - Finished task 0.0 in stage 60.0 (TID 98). 1858 bytes result sent to driver
20:13:58.878 INFO TaskSetManager - Finished task 0.0 in stage 60.0 (TID 98) in 845 ms on localhost (executor driver) (1/1)
20:13:58.879 INFO TaskSchedulerImpl - Removed TaskSet 60.0, whose tasks have all completed, from pool
20:13:58.879 INFO DAGScheduler - ResultStage 60 (runJob at SparkHadoopWriter.scala:83) finished in 0.859 s
20:13:58.879 INFO DAGScheduler - Job 44 is finished. Cancelling potential speculative or zombie tasks for this job
20:13:58.879 INFO TaskSchedulerImpl - Killing all running tasks in stage 60: Stage finished
20:13:58.879 INFO DAGScheduler - Job 44 finished: runJob at SparkHadoopWriter.scala:83, took 0.932433 s
20:13:58.880 INFO SparkHadoopWriter - Start to commit write Job job_20250210201357203934775909965574_0234.
20:13:58.880 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:13:58.881 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts dst=null perm=null proto=rpc
20:13:58.882 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/task_20250210201357203934775909965574_0234_r_000000 dst=null perm=null proto=rpc
20:13:58.882 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:58.883 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/task_20250210201357203934775909965574_0234_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:58.883 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:58.884 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/task_20250210201357203934775909965574_0234_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:58.884 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:13:58.885 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary/0/task_20250210201357203934775909965574_0234_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:58.886 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_temporary dst=null perm=null proto=rpc
20:13:58.886 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:58.887 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:58.888 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.spark-staging-234 dst=null perm=null proto=rpc
20:13:58.888 INFO SparkHadoopWriter - Write Job job_20250210201357203934775909965574_0234 committed. Elapsed time: 8 ms.
20:13:58.888 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:58.890 INFO StateChange - BLOCK* allocate blk_1073741862_1038, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/header
20:13:58.891 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741862_1038 src: /127.0.0.1:37758 dest: /127.0.0.1:38353
20:13:58.892 INFO clienttrace - src: /127.0.0.1:37758, dest: /127.0.0.1:38353, bytes: 1190, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741862_1038, duration(ns): 406304
20:13:58.892 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741862_1038, type=LAST_IN_PIPELINE terminating
20:13:58.892 INFO FSNamesystem - BLOCK* blk_1073741862_1038 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/header
20:13:59.293 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:59.295 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:59.296 INFO StateChange - BLOCK* allocate blk_1073741863_1039, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/terminator
20:13:59.297 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741863_1039 src: /127.0.0.1:37768 dest: /127.0.0.1:38353
20:13:59.298 INFO clienttrace - src: /127.0.0.1:37768, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741863_1039, duration(ns): 435638
20:13:59.298 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741863_1039, type=LAST_IN_PIPELINE terminating
20:13:59.299 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:59.299 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts dst=null perm=null proto=rpc
20:13:59.301 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:59.301 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:59.301 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam
20:13:59.302 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:59.302 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:13:59.303 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:59.303 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam done
20:13:59.303 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:13:59.304 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi
20:13:59.304 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts dst=null perm=null proto=rpc
20:13:59.305 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:59.306 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:59.306 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:59.307 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:13:59.308 INFO StateChange - BLOCK* allocate blk_1073741864_1040, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi
20:13:59.309 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741864_1040 src: /127.0.0.1:37774 dest: /127.0.0.1:38353
20:13:59.310 INFO clienttrace - src: /127.0.0.1:37774, dest: /127.0.0.1:38353, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741864_1040, duration(ns): 373239
20:13:59.310 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741864_1040, type=LAST_IN_PIPELINE terminating
20:13:59.311 INFO FSNamesystem - BLOCK* blk_1073741864_1040 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi
20:13:59.712 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:13:59.713 INFO IndexFileMerger - Done merging .sbi files
20:13:59.713 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai
20:13:59.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts dst=null perm=null proto=rpc
20:13:59.714 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:13:59.715 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:59.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:59.717 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:13:59.718 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:13:59.721 INFO StateChange - BLOCK* allocate blk_1073741865_1041, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai
20:13:59.722 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741865_1041 src: /127.0.0.1:37790 dest: /127.0.0.1:38353
20:13:59.723 INFO clienttrace - src: /127.0.0.1:37790, dest: /127.0.0.1:38353, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741865_1041, duration(ns): 424206
20:13:59.723 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741865_1041, type=LAST_IN_PIPELINE terminating
20:13:59.723 INFO FSNamesystem - BLOCK* blk_1073741865_1041 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai
20:14:00.124 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:00.125 INFO IndexFileMerger - Done merging .bai files
20:14:00.125 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.parts dst=null perm=null proto=rpc
20:14:00.134 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.142 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi dst=null perm=null proto=rpc
20:14:00.142 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi dst=null perm=null proto=rpc
20:14:00.143 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi dst=null perm=null proto=rpc
20:14:00.144 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
20:14:00.144 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.144 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.145 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.145 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.146 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.147 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.148 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.149 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
20:14:00.150 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:00.151 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
20:14:00.152 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi dst=null perm=null proto=rpc
20:14:00.152 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi dst=null perm=null proto=rpc
20:14:00.153 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.sbi dst=null perm=null proto=rpc
20:14:00.153 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
20:14:00.154 INFO MemoryStore - Block broadcast_104 stored as values in memory (estimated size 312.0 B, free 1918.8 MiB)
20:14:00.154 INFO MemoryStore - Block broadcast_104_piece0 stored as bytes in memory (estimated size 231.0 B, free 1918.8 MiB)
20:14:00.155 INFO BlockManagerInfo - Added broadcast_104_piece0 in memory on localhost:35739 (size: 231.0 B, free: 1919.7 MiB)
20:14:00.155 INFO SparkContext - Created broadcast 104 from broadcast at BamSource.java:104
20:14:00.156 INFO MemoryStore - Block broadcast_105 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
20:14:00.162 INFO MemoryStore - Block broadcast_105_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.5 MiB)
20:14:00.162 INFO BlockManagerInfo - Added broadcast_105_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:00.163 INFO SparkContext - Created broadcast 105 from newAPIHadoopFile at PathSplitSource.java:96
20:14:00.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.172 INFO FileInputFormat - Total input files to process : 1
20:14:00.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.192 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:00.193 INFO DAGScheduler - Got job 45 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:00.193 INFO DAGScheduler - Final stage: ResultStage 61 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:00.193 INFO DAGScheduler - Parents of final stage: List()
20:14:00.193 INFO DAGScheduler - Missing parents: List()
20:14:00.193 INFO DAGScheduler - Submitting ResultStage 61 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:00.200 INFO MemoryStore - Block broadcast_106 stored as values in memory (estimated size 148.2 KiB, free 1918.3 MiB)
20:14:00.201 INFO MemoryStore - Block broadcast_106_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.3 MiB)
20:14:00.201 INFO BlockManagerInfo - Added broadcast_106_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:14:00.201 INFO SparkContext - Created broadcast 106 from broadcast at DAGScheduler.scala:1580
20:14:00.201 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 61 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:00.201 INFO TaskSchedulerImpl - Adding task set 61.0 with 1 tasks resource profile 0
20:14:00.202 INFO TaskSetManager - Starting task 0.0 in stage 61.0 (TID 99) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:00.202 INFO Executor - Running task 0.0 in stage 61.0 (TID 99)
20:14:00.216 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam:0+236517
20:14:00.217 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.217 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.218 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.219 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.219 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.220 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
20:14:00.222 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:00.223 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:00.224 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:00.228 INFO Executor - Finished task 0.0 in stage 61.0 (TID 99). 749513 bytes result sent to driver
20:14:00.230 INFO TaskSetManager - Finished task 0.0 in stage 61.0 (TID 99) in 28 ms on localhost (executor driver) (1/1)
20:14:00.230 INFO TaskSchedulerImpl - Removed TaskSet 61.0, whose tasks have all completed, from pool
20:14:00.230 INFO DAGScheduler - ResultStage 61 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.036 s
20:14:00.230 INFO DAGScheduler - Job 45 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:00.230 INFO TaskSchedulerImpl - Killing all running tasks in stage 61: Stage finished
20:14:00.230 INFO DAGScheduler - Job 45 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.037553 s
20:14:00.253 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:00.253 INFO DAGScheduler - Got job 46 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:00.253 INFO DAGScheduler - Final stage: ResultStage 62 (count at ReadsSparkSinkUnitTest.java:185)
20:14:00.253 INFO DAGScheduler - Parents of final stage: List()
20:14:00.253 INFO DAGScheduler - Missing parents: List()
20:14:00.253 INFO DAGScheduler - Submitting ResultStage 62 (MapPartitionsRDD[222] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:00.270 INFO MemoryStore - Block broadcast_107 stored as values in memory (estimated size 426.1 KiB, free 1917.9 MiB)
20:14:00.271 INFO MemoryStore - Block broadcast_107_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.7 MiB)
20:14:00.272 INFO BlockManagerInfo - Added broadcast_107_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:00.272 INFO SparkContext - Created broadcast 107 from broadcast at DAGScheduler.scala:1580
20:14:00.272 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 62 (MapPartitionsRDD[222] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:00.272 INFO TaskSchedulerImpl - Adding task set 62.0 with 1 tasks resource profile 0
20:14:00.273 INFO TaskSetManager - Starting task 0.0 in stage 62.0 (TID 100) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
20:14:00.273 INFO Executor - Running task 0.0 in stage 62.0 (TID 100)
20:14:00.305 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:14:00.313 INFO Executor - Finished task 0.0 in stage 62.0 (TID 100). 989 bytes result sent to driver
20:14:00.314 INFO TaskSetManager - Finished task 0.0 in stage 62.0 (TID 100) in 41 ms on localhost (executor driver) (1/1)
20:14:00.314 INFO TaskSchedulerImpl - Removed TaskSet 62.0, whose tasks have all completed, from pool
20:14:00.314 INFO DAGScheduler - ResultStage 62 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
20:14:00.314 INFO DAGScheduler - Job 46 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:00.314 INFO TaskSchedulerImpl - Killing all running tasks in stage 62: Stage finished
20:14:00.314 INFO DAGScheduler - Job 46 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.061576 s
20:14:00.318 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:00.318 INFO DAGScheduler - Got job 47 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:00.318 INFO DAGScheduler - Final stage: ResultStage 63 (count at ReadsSparkSinkUnitTest.java:185)
20:14:00.318 INFO DAGScheduler - Parents of final stage: List()
20:14:00.318 INFO DAGScheduler - Missing parents: List()
20:14:00.318 INFO DAGScheduler - Submitting ResultStage 63 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:00.329 INFO MemoryStore - Block broadcast_108 stored as values in memory (estimated size 148.1 KiB, free 1917.6 MiB)
20:14:00.330 INFO MemoryStore - Block broadcast_108_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
20:14:00.330 INFO BlockManagerInfo - Added broadcast_108_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.4 MiB)
20:14:00.330 INFO SparkContext - Created broadcast 108 from broadcast at DAGScheduler.scala:1580
20:14:00.330 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 63 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:00.330 INFO TaskSchedulerImpl - Adding task set 63.0 with 1 tasks resource profile 0
20:14:00.331 INFO TaskSetManager - Starting task 0.0 in stage 63.0 (TID 101) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:00.331 INFO Executor - Running task 0.0 in stage 63.0 (TID 101)
20:14:00.343 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam:0+236517
20:14:00.344 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.345 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam dst=null perm=null proto=rpc
20:14:00.345 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.346 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.346 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_13489012-7fd4-4a4d-8efe-a46af2687330.bam.bai dst=null perm=null proto=rpc
20:14:00.348 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
20:14:00.349 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:00.349 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:00.350 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
20:14:00.351 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:00.352 INFO Executor - Finished task 0.0 in stage 63.0 (TID 101). 989 bytes result sent to driver
20:14:00.353 INFO TaskSetManager - Finished task 0.0 in stage 63.0 (TID 101) in 22 ms on localhost (executor driver) (1/1)
20:14:00.353 INFO TaskSchedulerImpl - Removed TaskSet 63.0, whose tasks have all completed, from pool
20:14:00.353 INFO DAGScheduler - ResultStage 63 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.034 s
20:14:00.353 INFO DAGScheduler - Job 47 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:00.353 INFO TaskSchedulerImpl - Killing all running tasks in stage 63: Stage finished
20:14:00.353 INFO DAGScheduler - Job 47 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.035539 s
20:14:00.359 INFO MemoryStore - Block broadcast_109 stored as values in memory (estimated size 576.0 B, free 1917.5 MiB)
20:14:00.362 INFO MemoryStore - Block broadcast_109_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.5 MiB)
20:14:00.362 INFO BlockManagerInfo - Added broadcast_109_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1919.4 MiB)
20:14:00.362 INFO SparkContext - Created broadcast 109 from broadcast at CramSource.java:114
20:14:00.364 INFO MemoryStore - Block broadcast_110 stored as values in memory (estimated size 297.9 KiB, free 1917.2 MiB)
20:14:00.373 INFO BlockManagerInfo - Removed broadcast_103_piece0 on localhost:35739 in memory (size: 58.6 KiB, free: 1919.5 MiB)
20:14:00.374 INFO BlockManagerInfo - Removed broadcast_101_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.5 MiB)
20:14:00.375 INFO BlockManagerInfo - Removed broadcast_98_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:00.375 INFO BlockManagerInfo - Removed broadcast_105_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:00.376 INFO BlockManagerInfo - Removed broadcast_100_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.6 MiB)
20:14:00.376 INFO BlockManagerInfo - Removed broadcast_108_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.6 MiB)
20:14:00.377 INFO BlockManagerInfo - Removed broadcast_106_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.7 MiB)
20:14:00.378 INFO BlockManagerInfo - Removed broadcast_102_piece0 on localhost:35739 in memory (size: 157.6 KiB, free: 1919.8 MiB)
20:14:00.379 INFO BlockManagerInfo - Removed broadcast_107_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1920.0 MiB)
20:14:00.380 INFO MemoryStore - Block broadcast_110_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.7 MiB)
20:14:00.380 INFO BlockManagerInfo - Removed broadcast_104_piece0 on localhost:35739 in memory (size: 231.0 B, free: 1920.0 MiB)
20:14:00.380 INFO BlockManagerInfo - Added broadcast_110_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1920.0 MiB)
20:14:00.381 INFO SparkContext - Created broadcast 110 from newAPIHadoopFile at PathSplitSource.java:96
20:14:00.404 INFO MemoryStore - Block broadcast_111 stored as values in memory (estimated size 576.0 B, free 1919.7 MiB)
20:14:00.405 INFO MemoryStore - Block broadcast_111_piece0 stored as bytes in memory (estimated size 228.0 B, free 1919.7 MiB)
20:14:00.405 INFO BlockManagerInfo - Added broadcast_111_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1920.0 MiB)
20:14:00.405 INFO SparkContext - Created broadcast 111 from broadcast at CramSource.java:114
20:14:00.410 INFO MemoryStore - Block broadcast_112 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
20:14:00.419 INFO MemoryStore - Block broadcast_112_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:14:00.419 INFO BlockManagerInfo - Added broadcast_112_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:00.419 INFO SparkContext - Created broadcast 112 from newAPIHadoopFile at PathSplitSource.java:96
20:14:00.437 INFO FileInputFormat - Total input files to process : 1
20:14:00.438 INFO MemoryStore - Block broadcast_113 stored as values in memory (estimated size 6.0 KiB, free 1919.3 MiB)
20:14:00.438 INFO MemoryStore - Block broadcast_113_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
20:14:00.439 INFO BlockManagerInfo - Added broadcast_113_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.9 MiB)
20:14:00.439 INFO SparkContext - Created broadcast 113 from broadcast at ReadsSparkSink.java:133
20:14:00.440 INFO MemoryStore - Block broadcast_114 stored as values in memory (estimated size 6.2 KiB, free 1919.3 MiB)
20:14:00.440 INFO MemoryStore - Block broadcast_114_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
20:14:00.440 INFO BlockManagerInfo - Added broadcast_114_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.9 MiB)
20:14:00.440 INFO SparkContext - Created broadcast 114 from broadcast at CramSink.java:76
20:14:00.444 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts dst=null perm=null proto=rpc
20:14:00.445 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:00.445 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:00.445 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:00.446 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:00.452 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:00.453 INFO DAGScheduler - Registering RDD 252 (mapToPair at SparkUtils.java:161) as input to shuffle 14
20:14:00.453 INFO DAGScheduler - Got job 48 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:00.453 INFO DAGScheduler - Final stage: ResultStage 65 (runJob at SparkHadoopWriter.scala:83)
20:14:00.453 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 64)
20:14:00.453 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 64)
20:14:00.453 INFO DAGScheduler - Submitting ShuffleMapStage 64 (MapPartitionsRDD[252] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:00.471 INFO MemoryStore - Block broadcast_115 stored as values in memory (estimated size 292.8 KiB, free 1919.0 MiB)
20:14:00.472 INFO MemoryStore - Block broadcast_115_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1918.9 MiB)
20:14:00.473 INFO BlockManagerInfo - Added broadcast_115_piece0 in memory on localhost:35739 (size: 107.3 KiB, free: 1919.8 MiB)
20:14:00.473 INFO SparkContext - Created broadcast 115 from broadcast at DAGScheduler.scala:1580
20:14:00.473 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 64 (MapPartitionsRDD[252] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:00.473 INFO TaskSchedulerImpl - Adding task set 64.0 with 1 tasks resource profile 0
20:14:00.474 INFO TaskSetManager - Starting task 0.0 in stage 64.0 (TID 102) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
20:14:00.474 INFO Executor - Running task 0.0 in stage 64.0 (TID 102)
20:14:00.497 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:00.526 INFO Executor - Finished task 0.0 in stage 64.0 (TID 102). 1148 bytes result sent to driver
20:14:00.527 INFO TaskSetManager - Finished task 0.0 in stage 64.0 (TID 102) in 53 ms on localhost (executor driver) (1/1)
20:14:00.527 INFO TaskSchedulerImpl - Removed TaskSet 64.0, whose tasks have all completed, from pool
20:14:00.527 INFO DAGScheduler - ShuffleMapStage 64 (mapToPair at SparkUtils.java:161) finished in 0.073 s
20:14:00.527 INFO DAGScheduler - looking for newly runnable stages
20:14:00.527 INFO DAGScheduler - running: HashSet()
20:14:00.527 INFO DAGScheduler - waiting: HashSet(ResultStage 65)
20:14:00.527 INFO DAGScheduler - failed: HashSet()
20:14:00.527 INFO DAGScheduler - Submitting ResultStage 65 (MapPartitionsRDD[257] at mapToPair at CramSink.java:89), which has no missing parents
20:14:00.534 INFO MemoryStore - Block broadcast_116 stored as values in memory (estimated size 153.3 KiB, free 1918.8 MiB)
20:14:00.535 INFO MemoryStore - Block broadcast_116_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1918.7 MiB)
20:14:00.535 INFO BlockManagerInfo - Added broadcast_116_piece0 in memory on localhost:35739 (size: 58.1 KiB, free: 1919.7 MiB)
20:14:00.535 INFO SparkContext - Created broadcast 116 from broadcast at DAGScheduler.scala:1580
20:14:00.536 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 65 (MapPartitionsRDD[257] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
20:14:00.536 INFO TaskSchedulerImpl - Adding task set 65.0 with 1 tasks resource profile 0
20:14:00.536 INFO TaskSetManager - Starting task 0.0 in stage 65.0 (TID 103) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:00.537 INFO Executor - Running task 0.0 in stage 65.0 (TID 103)
20:14:00.542 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:00.542 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:00.549 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:00.549 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:00.549 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:00.549 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:00.549 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:00.549 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:00.553 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/_temporary/attempt_202502102014005804900975687316016_0257_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:00.665 INFO StateChange - BLOCK* allocate blk_1073741866_1042, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/_temporary/attempt_202502102014005804900975687316016_0257_r_000000_0/part-r-00000
20:14:00.666 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741866_1042 src: /127.0.0.1:37810 dest: /127.0.0.1:38353
20:14:00.667 INFO clienttrace - src: /127.0.0.1:37810, dest: /127.0.0.1:38353, bytes: 42661, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741866_1042, duration(ns): 578699
20:14:00.667 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741866_1042, type=LAST_IN_PIPELINE terminating
20:14:00.668 INFO FSNamesystem - BLOCK* blk_1073741866_1042 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/_temporary/attempt_202502102014005804900975687316016_0257_r_000000_0/part-r-00000
20:14:01.048 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741860_1036 replica FinalizedReplica, blk_1073741860_1036, FINALIZED
  getNumBytes()     = 204
  getBytesOnDisk()  = 204
  getVisibleLength()= 204
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741860 for deletion
20:14:01.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741861_1037 replica FinalizedReplica, blk_1073741861_1037, FINALIZED
  getNumBytes()     = 592
  getBytesOnDisk()  = 592
  getVisibleLength()= 592
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741861 for deletion
20:14:01.049 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741860_1036 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741860
20:14:01.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741854_1030 replica FinalizedReplica, blk_1073741854_1030, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741854 for deletion
20:14:01.049 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741861_1037 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741861
20:14:01.049 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741854_1030 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741854
20:14:01.069 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/_temporary/attempt_202502102014005804900975687316016_0257_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:01.070 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/_temporary/attempt_202502102014005804900975687316016_0257_r_000000_0 dst=null perm=null proto=rpc
20:14:01.071 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/_temporary/attempt_202502102014005804900975687316016_0257_r_000000_0 dst=null perm=null proto=rpc
20:14:01.071 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/task_202502102014005804900975687316016_0257_r_000000 dst=null perm=null proto=rpc
20:14:01.072 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/_temporary/attempt_202502102014005804900975687316016_0257_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/task_202502102014005804900975687316016_0257_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:01.072 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014005804900975687316016_0257_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/task_202502102014005804900975687316016_0257_r_000000
20:14:01.072 INFO SparkHadoopMapRedUtil - attempt_202502102014005804900975687316016_0257_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:01.073 INFO Executor - Finished task 0.0 in stage 65.0 (TID 103). 1858 bytes result sent to driver
20:14:01.073 INFO TaskSetManager - Finished task 0.0 in stage 65.0 (TID 103) in 537 ms on localhost (executor driver) (1/1)
20:14:01.073 INFO TaskSchedulerImpl - Removed TaskSet 65.0, whose tasks have all completed, from pool
20:14:01.074 INFO DAGScheduler - ResultStage 65 (runJob at SparkHadoopWriter.scala:83) finished in 0.546 s
20:14:01.074 INFO DAGScheduler - Job 48 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:01.074 INFO TaskSchedulerImpl - Killing all running tasks in stage 65: Stage finished
20:14:01.074 INFO DAGScheduler - Job 48 finished: runJob at SparkHadoopWriter.scala:83, took 0.621717 s
20:14:01.074 INFO SparkHadoopWriter - Start to commit write Job job_202502102014005804900975687316016_0257.
20:14:01.075 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:01.075 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts dst=null perm=null proto=rpc
20:14:01.076 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/task_202502102014005804900975687316016_0257_r_000000 dst=null perm=null proto=rpc
20:14:01.076 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:01.077 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary/0/task_202502102014005804900975687316016_0257_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:01.078 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_temporary dst=null perm=null proto=rpc
20:14:01.078 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:01.079 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:01.080 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/.spark-staging-257 dst=null perm=null proto=rpc
20:14:01.080 INFO SparkHadoopWriter - Write Job job_202502102014005804900975687316016_0257 committed. Elapsed time: 5 ms.
20:14:01.081 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:01.083 INFO StateChange - BLOCK* allocate blk_1073741867_1043, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/header
20:14:01.084 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741867_1043 src: /127.0.0.1:37826 dest: /127.0.0.1:38353
20:14:01.085 INFO clienttrace - src: /127.0.0.1:37826, dest: /127.0.0.1:38353, bytes: 1016, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741867_1043, duration(ns): 412272
20:14:01.085 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741867_1043, type=LAST_IN_PIPELINE terminating
20:14:01.085 INFO FSNamesystem - BLOCK* blk_1073741867_1043 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/header
20:14:01.486 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:01.487 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:01.488 INFO StateChange - BLOCK* allocate blk_1073741868_1044, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/terminator
20:14:01.489 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741868_1044 src: /127.0.0.1:37836 dest: /127.0.0.1:38353
20:14:01.490 INFO clienttrace - src: /127.0.0.1:37836, dest: /127.0.0.1:38353, bytes: 38, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741868_1044, duration(ns): 464765
20:14:01.491 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741868_1044, type=LAST_IN_PIPELINE terminating
20:14:01.491 INFO FSNamesystem - BLOCK* blk_1073741868_1044 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/terminator
20:14:01.892 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:01.893 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts dst=null perm=null proto=rpc
20:14:01.894 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:01.895 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:01.895 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram
20:14:01.896 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/header, /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:01.896 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.896 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts/output dst=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:01.897 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram done
20:14:01.897 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.parts dst=null perm=null proto=rpc
20:14:01.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.900 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.crai dst=null perm=null proto=rpc
20:14:01.900 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.crai dst=null perm=null proto=rpc
20:14:01.903 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
20:14:01.903 WARN DFSUtil - Unexpected value for data transfer bytes=42997 duration=0
20:14:01.904 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
20:14:01.904 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.905 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.905 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.crai dst=null perm=null proto=rpc
20:14:01.905 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.crai dst=null perm=null proto=rpc
20:14:01.906 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.906 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.907 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
20:14:01.908 WARN DFSUtil - Unexpected value for data transfer bytes=42997 duration=0
20:14:01.908 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
20:14:01.909 INFO MemoryStore - Block broadcast_117 stored as values in memory (estimated size 528.0 B, free 1918.7 MiB)
20:14:01.909 INFO MemoryStore - Block broadcast_117_piece0 stored as bytes in memory (estimated size 187.0 B, free 1918.7 MiB)
20:14:01.909 INFO BlockManagerInfo - Added broadcast_117_piece0 in memory on localhost:35739 (size: 187.0 B, free: 1919.7 MiB)
20:14:01.910 INFO SparkContext - Created broadcast 117 from broadcast at CramSource.java:114
20:14:01.911 INFO MemoryStore - Block broadcast_118 stored as values in memory (estimated size 297.9 KiB, free 1918.4 MiB)
20:14:01.917 INFO MemoryStore - Block broadcast_118_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
20:14:01.917 INFO BlockManagerInfo - Added broadcast_118_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:01.918 INFO SparkContext - Created broadcast 118 from newAPIHadoopFile at PathSplitSource.java:96
20:14:01.933 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.933 INFO FileInputFormat - Total input files to process : 1
20:14:01.933 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:01.962 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:01.963 INFO DAGScheduler - Got job 49 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:01.963 INFO DAGScheduler - Final stage: ResultStage 66 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:01.963 INFO DAGScheduler - Parents of final stage: List()
20:14:01.963 INFO DAGScheduler - Missing parents: List()
20:14:01.963 INFO DAGScheduler - Submitting ResultStage 66 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:01.974 INFO MemoryStore - Block broadcast_119 stored as values in memory (estimated size 286.8 KiB, free 1918.1 MiB)
20:14:01.982 INFO BlockManagerInfo - Removed broadcast_114_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.7 MiB)
20:14:01.982 INFO MemoryStore - Block broadcast_119_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.0 MiB)
20:14:01.982 INFO BlockManagerInfo - Added broadcast_119_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.6 MiB)
20:14:01.982 INFO SparkContext - Created broadcast 119 from broadcast at DAGScheduler.scala:1580
20:14:01.982 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 66 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:01.983 INFO TaskSchedulerImpl - Adding task set 66.0 with 1 tasks resource profile 0
20:14:01.983 INFO BlockManagerInfo - Removed broadcast_115_piece0 on localhost:35739 in memory (size: 107.3 KiB, free: 1919.7 MiB)
20:14:01.983 INFO TaskSetManager - Starting task 0.0 in stage 66.0 (TID 104) (localhost, executor driver, partition 0, ANY, 7853 bytes)
20:14:01.984 INFO Executor - Running task 0.0 in stage 66.0 (TID 104)
20:14:01.984 INFO BlockManagerInfo - Removed broadcast_113_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.7 MiB)
20:14:01.985 INFO BlockManagerInfo - Removed broadcast_116_piece0 on localhost:35739 in memory (size: 58.1 KiB, free: 1919.8 MiB)
20:14:01.985 INFO BlockManagerInfo - Removed broadcast_111_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.8 MiB)
20:14:01.986 INFO BlockManagerInfo - Removed broadcast_112_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:02.006 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram:0+43715
20:14:02.007 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:02.007 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:02.008 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.crai dst=null perm=null proto=rpc
20:14:02.009 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.crai dst=null perm=null proto=rpc
20:14:02.011 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
20:14:02.012 WARN DFSUtil - Unexpected value for data transfer bytes=42997 duration=0
20:14:02.012 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
20:14:02.050 INFO Executor - Finished task 0.0 in stage 66.0 (TID 104). 154101 bytes result sent to driver
20:14:02.051 INFO TaskSetManager - Finished task 0.0 in stage 66.0 (TID 104) in 68 ms on localhost (executor driver) (1/1)
20:14:02.051 INFO DAGScheduler - ResultStage 66 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.088 s
20:14:02.052 INFO DAGScheduler - Job 49 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:02.052 INFO TaskSchedulerImpl - Removed TaskSet 66.0, whose tasks have all completed, from pool
20:14:02.052 INFO TaskSchedulerImpl - Killing all running tasks in stage 66: Stage finished
20:14:02.052 INFO DAGScheduler - Job 49 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.089666 s
20:14:02.060 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:02.061 INFO DAGScheduler - Got job 50 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:02.061 INFO DAGScheduler - Final stage: ResultStage 67 (count at ReadsSparkSinkUnitTest.java:185)
20:14:02.061 INFO DAGScheduler - Parents of final stage: List()
20:14:02.061 INFO DAGScheduler - Missing parents: List()
20:14:02.061 INFO DAGScheduler - Submitting ResultStage 67 (MapPartitionsRDD[246] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:02.082 INFO MemoryStore - Block broadcast_120 stored as values in memory (estimated size 286.8 KiB, free 1918.7 MiB)
20:14:02.083 INFO MemoryStore - Block broadcast_120_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.6 MiB)
20:14:02.083 INFO BlockManagerInfo - Added broadcast_120_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.7 MiB)
20:14:02.084 INFO SparkContext - Created broadcast 120 from broadcast at DAGScheduler.scala:1580
20:14:02.084 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 67 (MapPartitionsRDD[246] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:02.084 INFO TaskSchedulerImpl - Adding task set 67.0 with 1 tasks resource profile 0
20:14:02.084 INFO TaskSetManager - Starting task 0.0 in stage 67.0 (TID 105) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
20:14:02.085 INFO Executor - Running task 0.0 in stage 67.0 (TID 105)
20:14:02.105 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:02.114 INFO Executor - Finished task 0.0 in stage 67.0 (TID 105). 989 bytes result sent to driver
20:14:02.114 INFO TaskSetManager - Finished task 0.0 in stage 67.0 (TID 105) in 30 ms on localhost (executor driver) (1/1)
20:14:02.115 INFO TaskSchedulerImpl - Removed TaskSet 67.0, whose tasks have all completed, from pool
20:14:02.115 INFO DAGScheduler - ResultStage 67 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.054 s
20:14:02.115 INFO DAGScheduler - Job 50 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:02.115 INFO TaskSchedulerImpl - Killing all running tasks in stage 67: Stage finished
20:14:02.115 INFO DAGScheduler - Job 50 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.054810 s
20:14:02.119 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:02.119 INFO DAGScheduler - Got job 51 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:02.119 INFO DAGScheduler - Final stage: ResultStage 68 (count at ReadsSparkSinkUnitTest.java:185)
20:14:02.119 INFO DAGScheduler - Parents of final stage: List()
20:14:02.119 INFO DAGScheduler - Missing parents: List()
20:14:02.119 INFO DAGScheduler - Submitting ResultStage 68 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:02.131 INFO MemoryStore - Block broadcast_121 stored as values in memory (estimated size 286.8 KiB, free 1918.3 MiB)
20:14:02.132 INFO MemoryStore - Block broadcast_121_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.2 MiB)
20:14:02.132 INFO BlockManagerInfo - Added broadcast_121_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.6 MiB)
20:14:02.132 INFO SparkContext - Created broadcast 121 from broadcast at DAGScheduler.scala:1580
20:14:02.132 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 68 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:02.132 INFO TaskSchedulerImpl - Adding task set 68.0 with 1 tasks resource profile 0
20:14:02.133 INFO TaskSetManager - Starting task 0.0 in stage 68.0 (TID 106) (localhost, executor driver, partition 0, ANY, 7853 bytes)
20:14:02.134 INFO Executor - Running task 0.0 in stage 68.0 (TID 106)
20:14:02.159 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram:0+43715
20:14:02.160 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:02.161 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram dst=null perm=null proto=rpc
20:14:02.162 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.cram.crai dst=null perm=null proto=rpc
20:14:02.162 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_3ce10158-2d72-438a-927a-cfd2ce87ac2e.crai dst=null perm=null proto=rpc
20:14:02.164 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
20:14:02.165 WARN DFSUtil - Unexpected value for data transfer bytes=42997 duration=0
20:14:02.165 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
20:14:02.190 INFO Executor - Finished task 0.0 in stage 68.0 (TID 106). 989 bytes result sent to driver
20:14:02.190 INFO TaskSetManager - Finished task 0.0 in stage 68.0 (TID 106) in 57 ms on localhost (executor driver) (1/1)
20:14:02.190 INFO TaskSchedulerImpl - Removed TaskSet 68.0, whose tasks have all completed, from pool
20:14:02.190 INFO DAGScheduler - ResultStage 68 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.071 s
20:14:02.190 INFO DAGScheduler - Job 51 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:02.190 INFO TaskSchedulerImpl - Killing all running tasks in stage 68: Stage finished
20:14:02.190 INFO DAGScheduler - Job 51 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.071851 s
20:14:02.194 INFO MemoryStore - Block broadcast_122 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
20:14:02.201 INFO MemoryStore - Block broadcast_122_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.8 MiB)
20:14:02.201 INFO BlockManagerInfo - Added broadcast_122_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:02.201 INFO SparkContext - Created broadcast 122 from newAPIHadoopFile at PathSplitSource.java:96
20:14:02.223 INFO MemoryStore - Block broadcast_123 stored as values in memory (estimated size 297.9 KiB, free 1917.5 MiB)
20:14:02.229 INFO MemoryStore - Block broadcast_123_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
20:14:02.230 INFO BlockManagerInfo - Added broadcast_123_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:02.230 INFO SparkContext - Created broadcast 123 from newAPIHadoopFile at PathSplitSource.java:96
20:14:02.250 INFO FileInputFormat - Total input files to process : 1
20:14:02.252 INFO MemoryStore - Block broadcast_124 stored as values in memory (estimated size 160.7 KiB, free 1917.3 MiB)
20:14:02.253 INFO MemoryStore - Block broadcast_124_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
20:14:02.253 INFO BlockManagerInfo - Added broadcast_124_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:14:02.254 INFO SparkContext - Created broadcast 124 from broadcast at ReadsSparkSink.java:133
20:14:02.264 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts dst=null perm=null proto=rpc
20:14:02.265 INFO deprecation - mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
20:14:02.266 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
20:14:02.266 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:02.266 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:02.267 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:02.274 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:02.275 INFO DAGScheduler - Registering RDD 277 (mapToPair at SparkUtils.java:161) as input to shuffle 15
20:14:02.275 INFO DAGScheduler - Got job 52 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:02.275 INFO DAGScheduler - Final stage: ResultStage 70 (runJob at SparkHadoopWriter.scala:83)
20:14:02.275 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 69)
20:14:02.275 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 69)
20:14:02.275 INFO DAGScheduler - Submitting ShuffleMapStage 69 (MapPartitionsRDD[277] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:02.293 INFO MemoryStore - Block broadcast_125 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
20:14:02.294 INFO MemoryStore - Block broadcast_125_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
20:14:02.295 INFO BlockManagerInfo - Added broadcast_125_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.3 MiB)
20:14:02.295 INFO SparkContext - Created broadcast 125 from broadcast at DAGScheduler.scala:1580
20:14:02.295 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 69 (MapPartitionsRDD[277] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:02.295 INFO TaskSchedulerImpl - Adding task set 69.0 with 1 tasks resource profile 0
20:14:02.296 INFO TaskSetManager - Starting task 0.0 in stage 69.0 (TID 107) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:02.296 INFO Executor - Running task 0.0 in stage 69.0 (TID 107)
20:14:02.326 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:02.344 INFO Executor - Finished task 0.0 in stage 69.0 (TID 107). 1148 bytes result sent to driver
20:14:02.345 INFO TaskSetManager - Finished task 0.0 in stage 69.0 (TID 107) in 49 ms on localhost (executor driver) (1/1)
20:14:02.345 INFO TaskSchedulerImpl - Removed TaskSet 69.0, whose tasks have all completed, from pool
20:14:02.345 INFO DAGScheduler - ShuffleMapStage 69 (mapToPair at SparkUtils.java:161) finished in 0.069 s
20:14:02.346 INFO DAGScheduler - looking for newly runnable stages
20:14:02.346 INFO DAGScheduler - running: HashSet()
20:14:02.346 INFO DAGScheduler - waiting: HashSet(ResultStage 70)
20:14:02.346 INFO DAGScheduler - failed: HashSet()
20:14:02.346 INFO DAGScheduler - Submitting ResultStage 70 (MapPartitionsRDD[283] at saveAsTextFile at SamSink.java:65), which has no missing parents
20:14:02.357 INFO MemoryStore - Block broadcast_126 stored as values in memory (estimated size 241.1 KiB, free 1916.4 MiB)
20:14:02.358 INFO MemoryStore - Block broadcast_126_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.4 MiB)
20:14:02.358 INFO BlockManagerInfo - Added broadcast_126_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.3 MiB)
20:14:02.359 INFO SparkContext - Created broadcast 126 from broadcast at DAGScheduler.scala:1580
20:14:02.359 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 70 (MapPartitionsRDD[283] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
20:14:02.359 INFO TaskSchedulerImpl - Adding task set 70.0 with 1 tasks resource profile 0
20:14:02.359 INFO TaskSetManager - Starting task 0.0 in stage 70.0 (TID 108) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:02.360 INFO Executor - Running task 0.0 in stage 70.0 (TID 108)
20:14:02.365 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:02.365 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:02.380 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
20:14:02.380 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:02.380 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:02.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/_temporary/attempt_202502102014027301068996661199670_0283_m_000000_0/part-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:02.384 INFO StateChange - BLOCK* allocate blk_1073741869_1045, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/_temporary/attempt_202502102014027301068996661199670_0283_m_000000_0/part-00000
20:14:02.385 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741869_1045 src: /127.0.0.1:37844 dest: /127.0.0.1:38353
20:14:02.394 INFO clienttrace - src: /127.0.0.1:37844, dest: /127.0.0.1:38353, bytes: 761729, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741869_1045, duration(ns): 8179549
20:14:02.394 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741869_1045, type=LAST_IN_PIPELINE terminating
20:14:02.395 INFO FSNamesystem - BLOCK* blk_1073741869_1045 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/_temporary/attempt_202502102014027301068996661199670_0283_m_000000_0/part-00000
20:14:02.796 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/_temporary/attempt_202502102014027301068996661199670_0283_m_000000_0/part-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:02.797 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/_temporary/attempt_202502102014027301068996661199670_0283_m_000000_0 dst=null perm=null proto=rpc
20:14:02.798 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/_temporary/attempt_202502102014027301068996661199670_0283_m_000000_0 dst=null perm=null proto=rpc
20:14:02.798 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/task_202502102014027301068996661199670_0283_m_000000 dst=null perm=null proto=rpc
20:14:02.799 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/_temporary/attempt_202502102014027301068996661199670_0283_m_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/task_202502102014027301068996661199670_0283_m_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:02.799 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014027301068996661199670_0283_m_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/task_202502102014027301068996661199670_0283_m_000000
20:14:02.799 INFO SparkHadoopMapRedUtil - attempt_202502102014027301068996661199670_0283_m_000000_0: Committed. Elapsed time: 1 ms.
20:14:02.800 INFO Executor - Finished task 0.0 in stage 70.0 (TID 108). 1858 bytes result sent to driver
20:14:02.800 INFO TaskSetManager - Finished task 0.0 in stage 70.0 (TID 108) in 441 ms on localhost (executor driver) (1/1)
20:14:02.800 INFO TaskSchedulerImpl - Removed TaskSet 70.0, whose tasks have all completed, from pool
20:14:02.800 INFO DAGScheduler - ResultStage 70 (runJob at SparkHadoopWriter.scala:83) finished in 0.454 s
20:14:02.801 INFO DAGScheduler - Job 52 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:02.801 INFO TaskSchedulerImpl - Killing all running tasks in stage 70: Stage finished
20:14:02.801 INFO DAGScheduler - Job 52 finished: runJob at SparkHadoopWriter.scala:83, took 0.526436 s
20:14:02.801 INFO SparkHadoopWriter - Start to commit write Job job_202502102014027301068996661199670_0283.
20:14:02.802 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:02.802 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts dst=null perm=null proto=rpc
20:14:02.803 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/task_202502102014027301068996661199670_0283_m_000000 dst=null perm=null proto=rpc
20:14:02.803 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/part-00000 dst=null perm=null proto=rpc
20:14:02.804 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary/0/task_202502102014027301068996661199670_0283_m_000000/part-00000 dst=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/part-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:02.805 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_temporary dst=null perm=null proto=rpc
20:14:02.805 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:02.806 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:02.807 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/.spark-staging-283 dst=null perm=null proto=rpc
20:14:02.807 INFO SparkHadoopWriter - Write Job job_202502102014027301068996661199670_0283 committed. Elapsed time: 5 ms.
20:14:02.807 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:02.810 INFO StateChange - BLOCK* allocate blk_1073741870_1046, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/header
20:14:02.811 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741870_1046 src: /127.0.0.1:37856 dest: /127.0.0.1:38353
20:14:02.812 INFO clienttrace - src: /127.0.0.1:37856, dest: /127.0.0.1:38353, bytes: 85829, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741870_1046, duration(ns): 720398
20:14:02.813 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741870_1046, type=LAST_IN_PIPELINE terminating
20:14:02.813 INFO FSNamesystem - BLOCK* blk_1073741870_1046 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/header
20:14:03.214 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:03.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts dst=null perm=null proto=rpc
20:14:03.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:03.217 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:03.217 INFO HadoopFileSystemWrapper - Concatenating 2 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam
20:14:03.218 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/header, /user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/part-00000] dst=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:03.218 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.219 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:03.219 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam done
20:14:03.219 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam.parts dst=null perm=null proto=rpc
20:14:03.220 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.220 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.221 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.221 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
WARNING 2025-02-10 20:14:03 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:03.223 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
20:14:03.225 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.225 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.226 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
WARNING 2025-02-10 20:14:03 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:03.227 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
20:14:03.229 INFO MemoryStore - Block broadcast_127 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
20:14:03.231 INFO MemoryStore - Block broadcast_127_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
20:14:03.231 INFO BlockManagerInfo - Added broadcast_127_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:03.231 INFO SparkContext - Created broadcast 127 from broadcast at SamSource.java:78
20:14:03.232 INFO MemoryStore - Block broadcast_128 stored as values in memory (estimated size 297.9 KiB, free 1915.9 MiB)
20:14:03.242 INFO BlockManagerInfo - Removed broadcast_125_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.4 MiB)
20:14:03.242 INFO BlockManagerInfo - Removed broadcast_120_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.5 MiB)
20:14:03.243 INFO BlockManagerInfo - Removed broadcast_126_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.6 MiB)
20:14:03.244 INFO BlockManagerInfo - Removed broadcast_119_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.7 MiB)
20:14:03.244 INFO BlockManagerInfo - Removed broadcast_121_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.8 MiB)
20:14:03.245 INFO BlockManagerInfo - Removed broadcast_109_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.8 MiB)
20:14:03.246 INFO BlockManagerInfo - Removed broadcast_124_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:14:03.246 INFO BlockManagerInfo - Removed broadcast_123_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:03.247 INFO BlockManagerInfo - Removed broadcast_117_piece0 on localhost:35739 in memory (size: 187.0 B, free: 1919.8 MiB)
20:14:03.248 INFO BlockManagerInfo - Removed broadcast_110_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:03.248 INFO BlockManagerInfo - Removed broadcast_118_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:03.249 INFO MemoryStore - Block broadcast_128_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.2 MiB)
20:14:03.250 INFO BlockManagerInfo - Added broadcast_128_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:03.250 INFO SparkContext - Created broadcast 128 from newAPIHadoopFile at SamSource.java:108
20:14:03.259 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.259 INFO FileInputFormat - Total input files to process : 1
20:14:03.260 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.272 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:03.273 INFO DAGScheduler - Got job 53 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:03.273 INFO DAGScheduler - Final stage: ResultStage 71 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:03.273 INFO DAGScheduler - Parents of final stage: List()
20:14:03.273 INFO DAGScheduler - Missing parents: List()
20:14:03.273 INFO DAGScheduler - Submitting ResultStage 71 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:03.274 INFO MemoryStore - Block broadcast_129 stored as values in memory (estimated size 7.5 KiB, free 1919.1 MiB)
20:14:03.274 INFO MemoryStore - Block broadcast_129_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1919.1 MiB)
20:14:03.274 INFO BlockManagerInfo - Added broadcast_129_piece0 in memory on localhost:35739 (size: 3.8 KiB, free: 1919.9 MiB)
20:14:03.275 INFO SparkContext - Created broadcast 129 from broadcast at DAGScheduler.scala:1580
20:14:03.275 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 71 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:03.275 INFO TaskSchedulerImpl - Adding task set 71.0 with 1 tasks resource profile 0
20:14:03.276 INFO TaskSetManager - Starting task 0.0 in stage 71.0 (TID 109) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:03.276 INFO Executor - Running task 0.0 in stage 71.0 (TID 109)
20:14:03.277 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam:0+847558
20:14:03.282 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.319 INFO Executor - Finished task 0.0 in stage 71.0 (TID 109). 651526 bytes result sent to driver
20:14:03.321 INFO TaskSetManager - Finished task 0.0 in stage 71.0 (TID 109) in 46 ms on localhost (executor driver) (1/1)
20:14:03.321 INFO TaskSchedulerImpl - Removed TaskSet 71.0, whose tasks have all completed, from pool
20:14:03.321 INFO DAGScheduler - ResultStage 71 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.048 s
20:14:03.321 INFO DAGScheduler - Job 53 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:03.321 INFO TaskSchedulerImpl - Killing all running tasks in stage 71: Stage finished
20:14:03.321 INFO DAGScheduler - Job 53 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.048867 s
20:14:03.337 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:03.337 INFO DAGScheduler - Got job 54 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:03.337 INFO DAGScheduler - Final stage: ResultStage 72 (count at ReadsSparkSinkUnitTest.java:185)
20:14:03.337 INFO DAGScheduler - Parents of final stage: List()
20:14:03.337 INFO DAGScheduler - Missing parents: List()
20:14:03.337 INFO DAGScheduler - Submitting ResultStage 72 (MapPartitionsRDD[270] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:03.355 INFO MemoryStore - Block broadcast_130 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
20:14:03.356 INFO MemoryStore - Block broadcast_130_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
20:14:03.356 INFO BlockManagerInfo - Added broadcast_130_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.7 MiB)
20:14:03.356 INFO SparkContext - Created broadcast 130 from broadcast at DAGScheduler.scala:1580
20:14:03.356 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 72 (MapPartitionsRDD[270] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:03.356 INFO TaskSchedulerImpl - Adding task set 72.0 with 1 tasks resource profile 0
20:14:03.357 INFO TaskSetManager - Starting task 0.0 in stage 72.0 (TID 110) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:03.357 INFO Executor - Running task 0.0 in stage 72.0 (TID 110)
20:14:03.387 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:03.397 INFO Executor - Finished task 0.0 in stage 72.0 (TID 110). 989 bytes result sent to driver
20:14:03.398 INFO TaskSetManager - Finished task 0.0 in stage 72.0 (TID 110) in 41 ms on localhost (executor driver) (1/1)
20:14:03.398 INFO TaskSchedulerImpl - Removed TaskSet 72.0, whose tasks have all completed, from pool
20:14:03.398 INFO DAGScheduler - ResultStage 72 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
20:14:03.398 INFO DAGScheduler - Job 54 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:03.398 INFO TaskSchedulerImpl - Killing all running tasks in stage 72: Stage finished
20:14:03.398 INFO DAGScheduler - Job 54 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.061505 s
20:14:03.402 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:03.402 INFO DAGScheduler - Got job 55 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:03.402 INFO DAGScheduler - Final stage: ResultStage 73 (count at ReadsSparkSinkUnitTest.java:185)
20:14:03.402 INFO DAGScheduler - Parents of final stage: List()
20:14:03.403 INFO DAGScheduler - Missing parents: List()
20:14:03.403 INFO DAGScheduler - Submitting ResultStage 73 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:03.404 INFO MemoryStore - Block broadcast_131 stored as values in memory (estimated size 7.4 KiB, free 1918.6 MiB)
20:14:03.404 INFO MemoryStore - Block broadcast_131_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.6 MiB)
20:14:03.404 INFO BlockManagerInfo - Added broadcast_131_piece0 in memory on localhost:35739 (size: 3.8 KiB, free: 1919.7 MiB)
20:14:03.404 INFO SparkContext - Created broadcast 131 from broadcast at DAGScheduler.scala:1580
20:14:03.405 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 73 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:03.405 INFO TaskSchedulerImpl - Adding task set 73.0 with 1 tasks resource profile 0
20:14:03.405 INFO TaskSetManager - Starting task 0.0 in stage 73.0 (TID 111) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:03.406 INFO Executor - Running task 0.0 in stage 73.0 (TID 111)
20:14:03.408 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam:0+847558
20:14:03.409 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_466f6da2-d395-4669-85ec-25e96bfef7b6.sam dst=null perm=null proto=rpc
20:14:03.412 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
20:14:03.427 INFO Executor - Finished task 0.0 in stage 73.0 (TID 111). 989 bytes result sent to driver
20:14:03.427 INFO TaskSetManager - Finished task 0.0 in stage 73.0 (TID 111) in 22 ms on localhost (executor driver) (1/1)
20:14:03.427 INFO TaskSchedulerImpl - Removed TaskSet 73.0, whose tasks have all completed, from pool
20:14:03.427 INFO DAGScheduler - ResultStage 73 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.024 s
20:14:03.427 INFO DAGScheduler - Job 55 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:03.427 INFO TaskSchedulerImpl - Killing all running tasks in stage 73: Stage finished
20:14:03.428 INFO DAGScheduler - Job 55 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.025365 s
20:14:03.431 INFO MemoryStore - Block broadcast_132 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:03.437 INFO MemoryStore - Block broadcast_132_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
20:14:03.438 INFO BlockManagerInfo - Added broadcast_132_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:03.438 INFO SparkContext - Created broadcast 132 from newAPIHadoopFile at PathSplitSource.java:96
20:14:03.463 INFO MemoryStore - Block broadcast_133 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
20:14:03.470 INFO MemoryStore - Block broadcast_133_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
20:14:03.470 INFO BlockManagerInfo - Added broadcast_133_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:03.470 INFO SparkContext - Created broadcast 133 from newAPIHadoopFile at PathSplitSource.java:96
20:14:03.493 INFO MemoryStore - Block broadcast_134 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
20:14:03.494 INFO MemoryStore - Block broadcast_134_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
20:14:03.494 INFO BlockManagerInfo - Added broadcast_134_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:03.494 INFO SparkContext - Created broadcast 134 from broadcast at ReadsSparkSink.java:133
20:14:03.496 INFO MemoryStore - Block broadcast_135 stored as values in memory (estimated size 163.2 KiB, free 1917.6 MiB)
20:14:03.497 INFO MemoryStore - Block broadcast_135_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.6 MiB)
20:14:03.497 INFO BlockManagerInfo - Added broadcast_135_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:03.497 INFO SparkContext - Created broadcast 135 from broadcast at AnySamSinkMultiple.java:80
20:14:03.502 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:03.502 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:03.502 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:03.516 INFO FileInputFormat - Total input files to process : 1
20:14:03.524 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:03.524 INFO DAGScheduler - Registering RDD 296 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 16
20:14:03.525 INFO DAGScheduler - Got job 56 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:03.525 INFO DAGScheduler - Final stage: ResultStage 75 (runJob at SparkHadoopWriter.scala:83)
20:14:03.525 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 74)
20:14:03.525 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 74)
20:14:03.525 INFO DAGScheduler - Submitting ShuffleMapStage 74 (MapPartitionsRDD[296] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:03.543 INFO MemoryStore - Block broadcast_136 stored as values in memory (estimated size 427.7 KiB, free 1917.1 MiB)
20:14:03.544 INFO MemoryStore - Block broadcast_136_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.0 MiB)
20:14:03.544 INFO BlockManagerInfo - Added broadcast_136_piece0 in memory on localhost:35739 (size: 154.6 KiB, free: 1919.5 MiB)
20:14:03.544 INFO SparkContext - Created broadcast 136 from broadcast at DAGScheduler.scala:1580
20:14:03.544 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 74 (MapPartitionsRDD[296] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:03.544 INFO TaskSchedulerImpl - Adding task set 74.0 with 1 tasks resource profile 0
20:14:03.545 INFO TaskSetManager - Starting task 0.0 in stage 74.0 (TID 112) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:03.546 INFO Executor - Running task 0.0 in stage 74.0 (TID 112)
20:14:03.576 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:03.595 INFO Executor - Finished task 0.0 in stage 74.0 (TID 112). 1149 bytes result sent to driver
20:14:03.595 INFO TaskSetManager - Finished task 0.0 in stage 74.0 (TID 112) in 50 ms on localhost (executor driver) (1/1)
20:14:03.596 INFO TaskSchedulerImpl - Removed TaskSet 74.0, whose tasks have all completed, from pool
20:14:03.596 INFO DAGScheduler - ShuffleMapStage 74 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.071 s
20:14:03.596 INFO DAGScheduler - looking for newly runnable stages
20:14:03.596 INFO DAGScheduler - running: HashSet()
20:14:03.596 INFO DAGScheduler - waiting: HashSet(ResultStage 75)
20:14:03.596 INFO DAGScheduler - failed: HashSet()
20:14:03.596 INFO DAGScheduler - Submitting ResultStage 75 (MapPartitionsRDD[308] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:03.603 INFO MemoryStore - Block broadcast_137 stored as values in memory (estimated size 150.2 KiB, free 1916.8 MiB)
20:14:03.611 INFO MemoryStore - Block broadcast_137_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1916.8 MiB)
20:14:03.611 INFO BlockManagerInfo - Removed broadcast_129_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1919.5 MiB)
20:14:03.611 INFO BlockManagerInfo - Added broadcast_137_piece0 in memory on localhost:35739 (size: 56.3 KiB, free: 1919.4 MiB)
20:14:03.611 INFO SparkContext - Created broadcast 137 from broadcast at DAGScheduler.scala:1580
20:14:03.611 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 75 (MapPartitionsRDD[308] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:03.611 INFO TaskSchedulerImpl - Adding task set 75.0 with 2 tasks resource profile 0
20:14:03.612 INFO BlockManagerInfo - Removed broadcast_127_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:03.612 INFO BlockManagerInfo - Removed broadcast_131_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1919.4 MiB)
20:14:03.613 INFO TaskSetManager - Starting task 0.0 in stage 75.0 (TID 113) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:03.613 INFO TaskSetManager - Starting task 1.0 in stage 75.0 (TID 114) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:03.613 INFO Executor - Running task 1.0 in stage 75.0 (TID 114)
20:14:03.613 INFO Executor - Running task 0.0 in stage 75.0 (TID 113)
20:14:03.613 INFO BlockManagerInfo - Removed broadcast_122_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:03.615 INFO BlockManagerInfo - Removed broadcast_133_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:03.615 INFO BlockManagerInfo - Removed broadcast_130_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:03.616 INFO BlockManagerInfo - Removed broadcast_128_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:03.620 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:03.620 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:03.620 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:03.620 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:03.621 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:03.621 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:03.621 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:03.621 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:03.621 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:03.621 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:03.621 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:03.621 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:03.634 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:03.634 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:03.635 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:03.636 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:03.644 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014038469730301740916434_0308_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest118051931816202446210.bam/_temporary/0/task_202502102014038469730301740916434_0308_r_000001
20:14:03.644 INFO SparkHadoopMapRedUtil - attempt_202502102014038469730301740916434_0308_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:03.645 INFO Executor - Finished task 1.0 in stage 75.0 (TID 114). 1729 bytes result sent to driver
20:14:03.646 INFO TaskSetManager - Finished task 1.0 in stage 75.0 (TID 114) in 33 ms on localhost (executor driver) (1/2)
20:14:03.646 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014038469730301740916434_0308_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest118051931816202446210.bam/_temporary/0/task_202502102014038469730301740916434_0308_r_000000
20:14:03.646 INFO SparkHadoopMapRedUtil - attempt_202502102014038469730301740916434_0308_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:03.646 INFO Executor - Finished task 0.0 in stage 75.0 (TID 113). 1729 bytes result sent to driver
20:14:03.647 INFO TaskSetManager - Finished task 0.0 in stage 75.0 (TID 113) in 35 ms on localhost (executor driver) (2/2)
20:14:03.647 INFO TaskSchedulerImpl - Removed TaskSet 75.0, whose tasks have all completed, from pool
20:14:03.647 INFO DAGScheduler - ResultStage 75 (runJob at SparkHadoopWriter.scala:83) finished in 0.050 s
20:14:03.647 INFO DAGScheduler - Job 56 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:03.647 INFO TaskSchedulerImpl - Killing all running tasks in stage 75: Stage finished
20:14:03.648 INFO DAGScheduler - Job 56 finished: runJob at SparkHadoopWriter.scala:83, took 0.123496 s
20:14:03.648 INFO SparkHadoopWriter - Start to commit write Job job_202502102014038469730301740916434_0308.
20:14:03.655 INFO SparkHadoopWriter - Write Job job_202502102014038469730301740916434_0308 committed. Elapsed time: 6 ms.
20:14:03.658 INFO MemoryStore - Block broadcast_138 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:03.664 INFO MemoryStore - Block broadcast_138_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
20:14:03.664 INFO BlockManagerInfo - Added broadcast_138_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:03.664 INFO SparkContext - Created broadcast 138 from newAPIHadoopFile at PathSplitSource.java:96
20:14:03.688 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:03.688 INFO DAGScheduler - Got job 57 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:03.688 INFO DAGScheduler - Final stage: ResultStage 77 (count at ReadsSparkSinkUnitTest.java:222)
20:14:03.688 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 76)
20:14:03.688 INFO DAGScheduler - Missing parents: List()
20:14:03.689 INFO DAGScheduler - Submitting ResultStage 77 (MapPartitionsRDD[299] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:03.690 INFO MemoryStore - Block broadcast_139 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
20:14:03.690 INFO MemoryStore - Block broadcast_139_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
20:14:03.690 INFO BlockManagerInfo - Added broadcast_139_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.7 MiB)
20:14:03.690 INFO SparkContext - Created broadcast 139 from broadcast at DAGScheduler.scala:1580
20:14:03.691 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 77 (MapPartitionsRDD[299] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:03.691 INFO TaskSchedulerImpl - Adding task set 77.0 with 2 tasks resource profile 0
20:14:03.692 INFO TaskSetManager - Starting task 0.0 in stage 77.0 (TID 115) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:03.692 INFO TaskSetManager - Starting task 1.0 in stage 77.0 (TID 116) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:03.692 INFO Executor - Running task 0.0 in stage 77.0 (TID 115)
20:14:03.692 INFO Executor - Running task 1.0 in stage 77.0 (TID 116)
20:14:03.694 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:03.694 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:03.694 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:03.694 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:03.698 INFO Executor - Finished task 1.0 in stage 77.0 (TID 116). 1634 bytes result sent to driver
20:14:03.699 INFO TaskSetManager - Finished task 1.0 in stage 77.0 (TID 116) in 7 ms on localhost (executor driver) (1/2)
20:14:03.699 INFO Executor - Finished task 0.0 in stage 77.0 (TID 115). 1634 bytes result sent to driver
20:14:03.700 INFO TaskSetManager - Finished task 0.0 in stage 77.0 (TID 115) in 9 ms on localhost (executor driver) (2/2)
20:14:03.700 INFO TaskSchedulerImpl - Removed TaskSet 77.0, whose tasks have all completed, from pool
20:14:03.700 INFO DAGScheduler - ResultStage 77 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.011 s
20:14:03.700 INFO DAGScheduler - Job 57 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:03.700 INFO TaskSchedulerImpl - Killing all running tasks in stage 77: Stage finished
20:14:03.700 INFO DAGScheduler - Job 57 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.012262 s
20:14:03.715 INFO FileInputFormat - Total input files to process : 2
20:14:03.718 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:03.718 INFO DAGScheduler - Got job 58 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:03.718 INFO DAGScheduler - Final stage: ResultStage 78 (count at ReadsSparkSinkUnitTest.java:222)
20:14:03.718 INFO DAGScheduler - Parents of final stage: List()
20:14:03.718 INFO DAGScheduler - Missing parents: List()
20:14:03.719 INFO DAGScheduler - Submitting ResultStage 78 (MapPartitionsRDD[315] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:03.743 INFO MemoryStore - Block broadcast_140 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:14:03.744 INFO MemoryStore - Block broadcast_140_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:14:03.744 INFO BlockManagerInfo - Added broadcast_140_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:03.744 INFO SparkContext - Created broadcast 140 from broadcast at DAGScheduler.scala:1580
20:14:03.745 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 78 (MapPartitionsRDD[315] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:03.745 INFO TaskSchedulerImpl - Adding task set 78.0 with 2 tasks resource profile 0
20:14:03.745 INFO TaskSetManager - Starting task 0.0 in stage 78.0 (TID 117) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
20:14:03.745 INFO TaskSetManager - Starting task 1.0 in stage 78.0 (TID 118) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
20:14:03.746 INFO Executor - Running task 0.0 in stage 78.0 (TID 117)
20:14:03.746 INFO Executor - Running task 1.0 in stage 78.0 (TID 118)
20:14:03.775 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest118051931816202446210.bam/part-r-00000.bam:0+132492
20:14:03.785 INFO Executor - Finished task 1.0 in stage 78.0 (TID 118). 989 bytes result sent to driver
20:14:03.786 INFO TaskSetManager - Finished task 1.0 in stage 78.0 (TID 118) in 41 ms on localhost (executor driver) (1/2)
20:14:03.791 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest118051931816202446210.bam/part-r-00001.bam:0+129330
20:14:03.804 INFO Executor - Finished task 0.0 in stage 78.0 (TID 117). 989 bytes result sent to driver
20:14:03.804 INFO TaskSetManager - Finished task 0.0 in stage 78.0 (TID 117) in 59 ms on localhost (executor driver) (2/2)
20:14:03.804 INFO TaskSchedulerImpl - Removed TaskSet 78.0, whose tasks have all completed, from pool
20:14:03.804 INFO DAGScheduler - ResultStage 78 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.085 s
20:14:03.805 INFO DAGScheduler - Job 58 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:03.805 INFO TaskSchedulerImpl - Killing all running tasks in stage 78: Stage finished
20:14:03.805 INFO DAGScheduler - Job 58 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.086695 s
20:14:03.809 INFO MemoryStore - Block broadcast_141 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
20:14:03.817 INFO MemoryStore - Block broadcast_141_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
20:14:03.817 INFO BlockManagerInfo - Added broadcast_141_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:03.818 INFO SparkContext - Created broadcast 141 from newAPIHadoopFile at PathSplitSource.java:96
20:14:03.841 INFO MemoryStore - Block broadcast_142 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
20:14:03.847 INFO MemoryStore - Block broadcast_142_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
20:14:03.848 INFO BlockManagerInfo - Added broadcast_142_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:14:03.848 INFO SparkContext - Created broadcast 142 from newAPIHadoopFile at PathSplitSource.java:96
20:14:03.868 INFO MemoryStore - Block broadcast_143 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
20:14:03.869 INFO MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
20:14:03.869 INFO BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.4 MiB)
20:14:03.870 INFO SparkContext - Created broadcast 143 from broadcast at ReadsSparkSink.java:133
20:14:03.871 INFO MemoryStore - Block broadcast_144 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
20:14:03.872 INFO MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
20:14:03.872 INFO BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.4 MiB)
20:14:03.872 INFO SparkContext - Created broadcast 144 from broadcast at AnySamSinkMultiple.java:80
20:14:03.874 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:03.874 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:03.874 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:03.886 INFO FileInputFormat - Total input files to process : 1
20:14:03.893 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:03.893 INFO DAGScheduler - Registering RDD 323 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 17
20:14:03.893 INFO DAGScheduler - Got job 59 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:03.893 INFO DAGScheduler - Final stage: ResultStage 80 (runJob at SparkHadoopWriter.scala:83)
20:14:03.893 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 79)
20:14:03.893 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 79)
20:14:03.894 INFO DAGScheduler - Submitting ShuffleMapStage 79 (MapPartitionsRDD[323] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:03.920 INFO MemoryStore - Block broadcast_145 stored as values in memory (estimated size 427.7 KiB, free 1916.2 MiB)
20:14:03.928 INFO BlockManagerInfo - Removed broadcast_134_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:03.928 INFO MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.8 MiB)
20:14:03.928 INFO BlockManagerInfo - Removed broadcast_136_piece0 on localhost:35739 in memory (size: 154.6 KiB, free: 1919.6 MiB)
20:14:03.929 INFO BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:35739 (size: 154.6 KiB, free: 1919.4 MiB)
20:14:03.929 INFO SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1580
20:14:03.929 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 79 (MapPartitionsRDD[323] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:03.929 INFO TaskSchedulerImpl - Adding task set 79.0 with 1 tasks resource profile 0
20:14:03.930 INFO BlockManagerInfo - Removed broadcast_137_piece0 on localhost:35739 in memory (size: 56.3 KiB, free: 1919.5 MiB)
20:14:03.930 INFO TaskSetManager - Starting task 0.0 in stage 79.0 (TID 119) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:03.930 INFO Executor - Running task 0.0 in stage 79.0 (TID 119)
20:14:03.930 INFO BlockManagerInfo - Removed broadcast_132_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:03.931 INFO BlockManagerInfo - Removed broadcast_139_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1919.5 MiB)
20:14:03.932 INFO BlockManagerInfo - Removed broadcast_140_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:03.933 INFO BlockManagerInfo - Removed broadcast_142_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:03.933 INFO BlockManagerInfo - Removed broadcast_138_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:03.934 INFO BlockManagerInfo - Removed broadcast_135_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:14:03.966 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:03.992 INFO Executor - Finished task 0.0 in stage 79.0 (TID 119). 1149 bytes result sent to driver
20:14:03.992 INFO TaskSetManager - Finished task 0.0 in stage 79.0 (TID 119) in 62 ms on localhost (executor driver) (1/1)
20:14:03.992 INFO TaskSchedulerImpl - Removed TaskSet 79.0, whose tasks have all completed, from pool
20:14:03.992 INFO DAGScheduler - ShuffleMapStage 79 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.098 s
20:14:03.992 INFO DAGScheduler - looking for newly runnable stages
20:14:03.992 INFO DAGScheduler - running: HashSet()
20:14:03.992 INFO DAGScheduler - waiting: HashSet(ResultStage 80)
20:14:03.992 INFO DAGScheduler - failed: HashSet()
20:14:03.993 INFO DAGScheduler - Submitting ResultStage 80 (MapPartitionsRDD[335] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:04.004 INFO MemoryStore - Block broadcast_146 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
20:14:04.004 INFO MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1918.6 MiB)
20:14:04.005 INFO BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:35739 (size: 56.3 KiB, free: 1919.7 MiB)
20:14:04.005 INFO SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1580
20:14:04.005 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 80 (MapPartitionsRDD[335] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.005 INFO TaskSchedulerImpl - Adding task set 80.0 with 2 tasks resource profile 0
20:14:04.006 INFO TaskSetManager - Starting task 0.0 in stage 80.0 (TID 120) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:04.006 INFO TaskSetManager - Starting task 1.0 in stage 80.0 (TID 121) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:04.007 INFO Executor - Running task 0.0 in stage 80.0 (TID 120)
20:14:04.007 INFO Executor - Running task 1.0 in stage 80.0 (TID 121)
20:14:04.013 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.013 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.013 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.013 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.013 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.013 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.013 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.013 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.013 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.013 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.013 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.013 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.026 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.026 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.030 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.031 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.037 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014032093204671091793510_0335_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest112462478906273271266.bam/_temporary/0/task_202502102014032093204671091793510_0335_r_000000
20:14:04.037 INFO SparkHadoopMapRedUtil - attempt_202502102014032093204671091793510_0335_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:04.038 INFO Executor - Finished task 0.0 in stage 80.0 (TID 120). 1729 bytes result sent to driver
20:14:04.039 INFO TaskSetManager - Finished task 0.0 in stage 80.0 (TID 120) in 33 ms on localhost (executor driver) (1/2)
20:14:04.040 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014032093204671091793510_0335_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest112462478906273271266.bam/_temporary/0/task_202502102014032093204671091793510_0335_r_000001
20:14:04.040 INFO SparkHadoopMapRedUtil - attempt_202502102014032093204671091793510_0335_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:04.041 INFO Executor - Finished task 1.0 in stage 80.0 (TID 121). 1729 bytes result sent to driver
20:14:04.041 INFO TaskSetManager - Finished task 1.0 in stage 80.0 (TID 121) in 35 ms on localhost (executor driver) (2/2)
20:14:04.041 INFO TaskSchedulerImpl - Removed TaskSet 80.0, whose tasks have all completed, from pool
20:14:04.042 INFO DAGScheduler - ResultStage 80 (runJob at SparkHadoopWriter.scala:83) finished in 0.048 s
20:14:04.042 INFO DAGScheduler - Job 59 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.042 INFO TaskSchedulerImpl - Killing all running tasks in stage 80: Stage finished
20:14:04.042 INFO DAGScheduler - Job 59 finished: runJob at SparkHadoopWriter.scala:83, took 0.149094 s
20:14:04.042 INFO SparkHadoopWriter - Start to commit write Job job_202502102014032093204671091793510_0335.
20:14:04.049 INFO SparkHadoopWriter - Write Job job_202502102014032093204671091793510_0335 committed. Elapsed time: 6 ms.
20:14:04.052 INFO MemoryStore - Block broadcast_147 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:04.062 INFO MemoryStore - Block broadcast_147_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
20:14:04.063 INFO BlockManagerInfo - Added broadcast_147_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:04.063 INFO SparkContext - Created broadcast 147 from newAPIHadoopFile at PathSplitSource.java:96
20:14:04.092 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:04.093 INFO DAGScheduler - Got job 60 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:04.093 INFO DAGScheduler - Final stage: ResultStage 82 (count at ReadsSparkSinkUnitTest.java:222)
20:14:04.093 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 81)
20:14:04.093 INFO DAGScheduler - Missing parents: List()
20:14:04.093 INFO DAGScheduler - Submitting ResultStage 82 (MapPartitionsRDD[326] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:04.094 INFO MemoryStore - Block broadcast_148 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
20:14:04.094 INFO MemoryStore - Block broadcast_148_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
20:14:04.095 INFO BlockManagerInfo - Added broadcast_148_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.7 MiB)
20:14:04.095 INFO SparkContext - Created broadcast 148 from broadcast at DAGScheduler.scala:1580
20:14:04.095 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 82 (MapPartitionsRDD[326] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.095 INFO TaskSchedulerImpl - Adding task set 82.0 with 2 tasks resource profile 0
20:14:04.096 INFO TaskSetManager - Starting task 0.0 in stage 82.0 (TID 122) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:04.096 INFO TaskSetManager - Starting task 1.0 in stage 82.0 (TID 123) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:04.096 INFO Executor - Running task 0.0 in stage 82.0 (TID 122)
20:14:04.096 INFO Executor - Running task 1.0 in stage 82.0 (TID 123)
20:14:04.098 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.098 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.098 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.098 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.103 INFO Executor - Finished task 0.0 in stage 82.0 (TID 122). 1634 bytes result sent to driver
20:14:04.103 INFO TaskSetManager - Finished task 0.0 in stage 82.0 (TID 122) in 7 ms on localhost (executor driver) (1/2)
20:14:04.103 INFO Executor - Finished task 1.0 in stage 82.0 (TID 123). 1634 bytes result sent to driver
20:14:04.104 INFO TaskSetManager - Finished task 1.0 in stage 82.0 (TID 123) in 8 ms on localhost (executor driver) (2/2)
20:14:04.104 INFO TaskSchedulerImpl - Removed TaskSet 82.0, whose tasks have all completed, from pool
20:14:04.104 INFO DAGScheduler - ResultStage 82 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.011 s
20:14:04.104 INFO DAGScheduler - Job 60 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.104 INFO TaskSchedulerImpl - Killing all running tasks in stage 82: Stage finished
20:14:04.104 INFO DAGScheduler - Job 60 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.012224 s
20:14:04.118 INFO FileInputFormat - Total input files to process : 2
20:14:04.121 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:04.121 INFO DAGScheduler - Got job 61 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:04.121 INFO DAGScheduler - Final stage: ResultStage 83 (count at ReadsSparkSinkUnitTest.java:222)
20:14:04.122 INFO DAGScheduler - Parents of final stage: List()
20:14:04.122 INFO DAGScheduler - Missing parents: List()
20:14:04.122 INFO DAGScheduler - Submitting ResultStage 83 (MapPartitionsRDD[342] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:04.139 INFO MemoryStore - Block broadcast_149 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:14:04.140 INFO MemoryStore - Block broadcast_149_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:14:04.140 INFO BlockManagerInfo - Added broadcast_149_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:04.141 INFO SparkContext - Created broadcast 149 from broadcast at DAGScheduler.scala:1580
20:14:04.141 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 83 (MapPartitionsRDD[342] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.141 INFO TaskSchedulerImpl - Adding task set 83.0 with 2 tasks resource profile 0
20:14:04.142 INFO TaskSetManager - Starting task 0.0 in stage 83.0 (TID 124) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
20:14:04.142 INFO TaskSetManager - Starting task 1.0 in stage 83.0 (TID 125) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
20:14:04.142 INFO Executor - Running task 0.0 in stage 83.0 (TID 124)
20:14:04.142 INFO Executor - Running task 1.0 in stage 83.0 (TID 125)
20:14:04.173 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest112462478906273271266.bam/part-r-00001.bam:0+129330
20:14:04.184 INFO Executor - Finished task 0.0 in stage 83.0 (TID 124). 989 bytes result sent to driver
20:14:04.184 INFO TaskSetManager - Finished task 0.0 in stage 83.0 (TID 124) in 43 ms on localhost (executor driver) (1/2)
20:14:04.187 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest112462478906273271266.bam/part-r-00000.bam:0+132492
20:14:04.200 INFO Executor - Finished task 1.0 in stage 83.0 (TID 125). 989 bytes result sent to driver
20:14:04.200 INFO TaskSetManager - Finished task 1.0 in stage 83.0 (TID 125) in 58 ms on localhost (executor driver) (2/2)
20:14:04.200 INFO TaskSchedulerImpl - Removed TaskSet 83.0, whose tasks have all completed, from pool
20:14:04.200 INFO DAGScheduler - ResultStage 83 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.078 s
20:14:04.200 INFO DAGScheduler - Job 61 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.200 INFO TaskSchedulerImpl - Killing all running tasks in stage 83: Stage finished
20:14:04.201 INFO DAGScheduler - Job 61 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.079479 s
20:14:04.204 INFO MemoryStore - Block broadcast_150 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
20:14:04.210 INFO MemoryStore - Block broadcast_150_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
20:14:04.210 INFO BlockManagerInfo - Added broadcast_150_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:04.211 INFO SparkContext - Created broadcast 150 from newAPIHadoopFile at PathSplitSource.java:96
20:14:04.235 INFO MemoryStore - Block broadcast_151 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
20:14:04.241 INFO MemoryStore - Block broadcast_151_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
20:14:04.241 INFO BlockManagerInfo - Added broadcast_151_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:14:04.242 INFO SparkContext - Created broadcast 151 from newAPIHadoopFile at PathSplitSource.java:96
20:14:04.262 INFO MemoryStore - Block broadcast_152 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
20:14:04.263 INFO MemoryStore - Block broadcast_152_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
20:14:04.263 INFO BlockManagerInfo - Added broadcast_152_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.4 MiB)
20:14:04.263 INFO SparkContext - Created broadcast 152 from broadcast at ReadsSparkSink.java:133
20:14:04.265 INFO MemoryStore - Block broadcast_153 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
20:14:04.272 INFO MemoryStore - Block broadcast_153_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
20:14:04.273 INFO BlockManagerInfo - Added broadcast_153_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.4 MiB)
20:14:04.273 INFO BlockManagerInfo - Removed broadcast_149_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.6 MiB)
20:14:04.273 INFO SparkContext - Created broadcast 153 from broadcast at AnySamSinkMultiple.java:80
20:14:04.274 INFO BlockManagerInfo - Removed broadcast_147_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:04.274 INFO BlockManagerInfo - Removed broadcast_143_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.6 MiB)
20:14:04.275 INFO BlockManagerInfo - Removed broadcast_148_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1919.6 MiB)
20:14:04.276 INFO BlockManagerInfo - Removed broadcast_146_piece0 on localhost:35739 in memory (size: 56.3 KiB, free: 1919.7 MiB)
20:14:04.276 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.276 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.276 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.276 INFO BlockManagerInfo - Removed broadcast_145_piece0 on localhost:35739 in memory (size: 154.6 KiB, free: 1919.8 MiB)
20:14:04.277 INFO BlockManagerInfo - Removed broadcast_151_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:04.277 INFO BlockManagerInfo - Removed broadcast_141_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:04.278 INFO BlockManagerInfo - Removed broadcast_144_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.9 MiB)
20:14:04.289 INFO FileInputFormat - Total input files to process : 1
20:14:04.296 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:04.296 INFO DAGScheduler - Registering RDD 350 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 18
20:14:04.296 INFO DAGScheduler - Got job 62 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:04.296 INFO DAGScheduler - Final stage: ResultStage 85 (runJob at SparkHadoopWriter.scala:83)
20:14:04.296 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 84)
20:14:04.296 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 84)
20:14:04.297 INFO DAGScheduler - Submitting ShuffleMapStage 84 (MapPartitionsRDD[350] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:04.321 INFO MemoryStore - Block broadcast_154 stored as values in memory (estimated size 427.7 KiB, free 1918.9 MiB)
20:14:04.322 INFO MemoryStore - Block broadcast_154_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.8 MiB)
20:14:04.322 INFO BlockManagerInfo - Added broadcast_154_piece0 in memory on localhost:35739 (size: 154.6 KiB, free: 1919.8 MiB)
20:14:04.323 INFO SparkContext - Created broadcast 154 from broadcast at DAGScheduler.scala:1580
20:14:04.323 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 84 (MapPartitionsRDD[350] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:04.323 INFO TaskSchedulerImpl - Adding task set 84.0 with 1 tasks resource profile 0
20:14:04.324 INFO TaskSetManager - Starting task 0.0 in stage 84.0 (TID 126) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:04.324 INFO Executor - Running task 0.0 in stage 84.0 (TID 126)
20:14:04.353 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:04.371 INFO Executor - Finished task 0.0 in stage 84.0 (TID 126). 1149 bytes result sent to driver
20:14:04.371 INFO TaskSetManager - Finished task 0.0 in stage 84.0 (TID 126) in 48 ms on localhost (executor driver) (1/1)
20:14:04.371 INFO TaskSchedulerImpl - Removed TaskSet 84.0, whose tasks have all completed, from pool
20:14:04.371 INFO DAGScheduler - ShuffleMapStage 84 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.073 s
20:14:04.372 INFO DAGScheduler - looking for newly runnable stages
20:14:04.372 INFO DAGScheduler - running: HashSet()
20:14:04.372 INFO DAGScheduler - waiting: HashSet(ResultStage 85)
20:14:04.372 INFO DAGScheduler - failed: HashSet()
20:14:04.372 INFO DAGScheduler - Submitting ResultStage 85 (MapPartitionsRDD[362] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:04.378 INFO MemoryStore - Block broadcast_155 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
20:14:04.379 INFO MemoryStore - Block broadcast_155_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1918.6 MiB)
20:14:04.379 INFO BlockManagerInfo - Added broadcast_155_piece0 in memory on localhost:35739 (size: 56.2 KiB, free: 1919.7 MiB)
20:14:04.380 INFO SparkContext - Created broadcast 155 from broadcast at DAGScheduler.scala:1580
20:14:04.380 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 85 (MapPartitionsRDD[362] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.380 INFO TaskSchedulerImpl - Adding task set 85.0 with 2 tasks resource profile 0
20:14:04.380 INFO TaskSetManager - Starting task 0.0 in stage 85.0 (TID 127) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:04.381 INFO TaskSetManager - Starting task 1.0 in stage 85.0 (TID 128) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:04.381 INFO Executor - Running task 0.0 in stage 85.0 (TID 127)
20:14:04.381 INFO Executor - Running task 1.0 in stage 85.0 (TID 128)
20:14:04.387 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.387 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.387 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.387 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.387 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.387 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.387 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.387 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.387 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.388 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.388 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.388 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.397 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.397 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.402 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.402 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.405 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014042717423550429546367_0362_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest117975595317650464137.bam/_temporary/0/task_202502102014042717423550429546367_0362_r_000001
20:14:04.405 INFO SparkHadoopMapRedUtil - attempt_202502102014042717423550429546367_0362_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:04.406 INFO Executor - Finished task 1.0 in stage 85.0 (TID 128). 1729 bytes result sent to driver
20:14:04.407 INFO TaskSetManager - Finished task 1.0 in stage 85.0 (TID 128) in 26 ms on localhost (executor driver) (1/2)
20:14:04.410 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014042717423550429546367_0362_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest117975595317650464137.bam/_temporary/0/task_202502102014042717423550429546367_0362_r_000000
20:14:04.410 INFO SparkHadoopMapRedUtil - attempt_202502102014042717423550429546367_0362_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:04.410 INFO Executor - Finished task 0.0 in stage 85.0 (TID 127). 1729 bytes result sent to driver
20:14:04.410 INFO TaskSetManager - Finished task 0.0 in stage 85.0 (TID 127) in 30 ms on localhost (executor driver) (2/2)
20:14:04.411 INFO DAGScheduler - ResultStage 85 (runJob at SparkHadoopWriter.scala:83) finished in 0.039 s
20:14:04.411 INFO DAGScheduler - Job 62 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.411 INFO TaskSchedulerImpl - Removed TaskSet 85.0, whose tasks have all completed, from pool
20:14:04.411 INFO TaskSchedulerImpl - Killing all running tasks in stage 85: Stage finished
20:14:04.411 INFO DAGScheduler - Job 62 finished: runJob at SparkHadoopWriter.scala:83, took 0.115388 s
20:14:04.412 INFO SparkHadoopWriter - Start to commit write Job job_202502102014042717423550429546367_0362.
20:14:04.417 INFO SparkHadoopWriter - Write Job job_202502102014042717423550429546367_0362 committed. Elapsed time: 5 ms.
20:14:04.421 INFO MemoryStore - Block broadcast_156 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:04.431 INFO MemoryStore - Block broadcast_156_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
20:14:04.432 INFO BlockManagerInfo - Added broadcast_156_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:04.432 INFO SparkContext - Created broadcast 156 from newAPIHadoopFile at PathSplitSource.java:96
20:14:04.455 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:04.455 INFO DAGScheduler - Got job 63 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:04.456 INFO DAGScheduler - Final stage: ResultStage 87 (count at ReadsSparkSinkUnitTest.java:222)
20:14:04.456 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 86)
20:14:04.456 INFO DAGScheduler - Missing parents: List()
20:14:04.456 INFO DAGScheduler - Submitting ResultStage 87 (MapPartitionsRDD[353] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:04.457 INFO MemoryStore - Block broadcast_157 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
20:14:04.457 INFO MemoryStore - Block broadcast_157_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
20:14:04.457 INFO BlockManagerInfo - Added broadcast_157_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.7 MiB)
20:14:04.457 INFO SparkContext - Created broadcast 157 from broadcast at DAGScheduler.scala:1580
20:14:04.458 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 87 (MapPartitionsRDD[353] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.458 INFO TaskSchedulerImpl - Adding task set 87.0 with 2 tasks resource profile 0
20:14:04.458 INFO TaskSetManager - Starting task 0.0 in stage 87.0 (TID 129) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:04.459 INFO TaskSetManager - Starting task 1.0 in stage 87.0 (TID 130) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:04.459 INFO Executor - Running task 1.0 in stage 87.0 (TID 130)
20:14:04.459 INFO Executor - Running task 0.0 in stage 87.0 (TID 129)
20:14:04.461 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.461 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.461 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.461 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.466 INFO Executor - Finished task 1.0 in stage 87.0 (TID 130). 1634 bytes result sent to driver
20:14:04.466 INFO Executor - Finished task 0.0 in stage 87.0 (TID 129). 1634 bytes result sent to driver
20:14:04.466 INFO TaskSetManager - Finished task 1.0 in stage 87.0 (TID 130) in 7 ms on localhost (executor driver) (1/2)
20:14:04.466 INFO TaskSetManager - Finished task 0.0 in stage 87.0 (TID 129) in 8 ms on localhost (executor driver) (2/2)
20:14:04.466 INFO TaskSchedulerImpl - Removed TaskSet 87.0, whose tasks have all completed, from pool
20:14:04.466 INFO DAGScheduler - ResultStage 87 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
20:14:04.466 INFO DAGScheduler - Job 63 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.467 INFO TaskSchedulerImpl - Killing all running tasks in stage 87: Stage finished
20:14:04.467 INFO DAGScheduler - Job 63 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011507 s
20:14:04.479 INFO FileInputFormat - Total input files to process : 2
20:14:04.483 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:04.483 INFO DAGScheduler - Got job 64 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:04.483 INFO DAGScheduler - Final stage: ResultStage 88 (count at ReadsSparkSinkUnitTest.java:222)
20:14:04.483 INFO DAGScheduler - Parents of final stage: List()
20:14:04.483 INFO DAGScheduler - Missing parents: List()
20:14:04.484 INFO DAGScheduler - Submitting ResultStage 88 (MapPartitionsRDD[369] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:04.501 INFO MemoryStore - Block broadcast_158 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:14:04.502 INFO MemoryStore - Block broadcast_158_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:14:04.502 INFO BlockManagerInfo - Added broadcast_158_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:04.502 INFO SparkContext - Created broadcast 158 from broadcast at DAGScheduler.scala:1580
20:14:04.503 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 88 (MapPartitionsRDD[369] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.503 INFO TaskSchedulerImpl - Adding task set 88.0 with 2 tasks resource profile 0
20:14:04.503 INFO TaskSetManager - Starting task 0.0 in stage 88.0 (TID 131) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
20:14:04.503 INFO TaskSetManager - Starting task 1.0 in stage 88.0 (TID 132) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
20:14:04.504 INFO Executor - Running task 0.0 in stage 88.0 (TID 131)
20:14:04.504 INFO Executor - Running task 1.0 in stage 88.0 (TID 132)
20:14:04.542 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117975595317650464137.bam/part-r-00000.bam:0+132492
20:14:04.549 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117975595317650464137.bam/part-r-00001.bam:0+129330
20:14:04.551 INFO Executor - Finished task 1.0 in stage 88.0 (TID 132). 989 bytes result sent to driver
20:14:04.551 INFO TaskSetManager - Finished task 1.0 in stage 88.0 (TID 132) in 48 ms on localhost (executor driver) (1/2)
20:14:04.562 INFO Executor - Finished task 0.0 in stage 88.0 (TID 131). 989 bytes result sent to driver
20:14:04.562 INFO TaskSetManager - Finished task 0.0 in stage 88.0 (TID 131) in 59 ms on localhost (executor driver) (2/2)
20:14:04.562 INFO TaskSchedulerImpl - Removed TaskSet 88.0, whose tasks have all completed, from pool
20:14:04.563 INFO DAGScheduler - ResultStage 88 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.079 s
20:14:04.563 INFO DAGScheduler - Job 64 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.563 INFO TaskSchedulerImpl - Killing all running tasks in stage 88: Stage finished
20:14:04.563 INFO DAGScheduler - Job 64 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.079937 s
20:14:04.567 INFO MemoryStore - Block broadcast_159 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
20:14:04.577 INFO MemoryStore - Block broadcast_159_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
20:14:04.577 INFO BlockManagerInfo - Added broadcast_159_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:04.578 INFO SparkContext - Created broadcast 159 from newAPIHadoopFile at PathSplitSource.java:96
20:14:04.611 INFO MemoryStore - Block broadcast_160 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
20:14:04.618 INFO MemoryStore - Block broadcast_160_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
20:14:04.618 INFO BlockManagerInfo - Added broadcast_160_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:14:04.618 INFO SparkContext - Created broadcast 160 from newAPIHadoopFile at PathSplitSource.java:96
20:14:04.638 INFO MemoryStore - Block broadcast_161 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
20:14:04.645 INFO BlockManagerInfo - Removed broadcast_153_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:04.645 INFO MemoryStore - Block broadcast_161_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
20:14:04.645 INFO BlockManagerInfo - Added broadcast_161_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.4 MiB)
20:14:04.646 INFO SparkContext - Created broadcast 161 from broadcast at ReadsSparkSink.java:133
20:14:04.646 INFO BlockManagerInfo - Removed broadcast_156_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:04.646 INFO BlockManagerInfo - Removed broadcast_152_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:04.647 INFO BlockManagerInfo - Removed broadcast_150_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:04.647 INFO BlockManagerInfo - Removed broadcast_158_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:04.648 INFO MemoryStore - Block broadcast_162 stored as values in memory (estimated size 163.2 KiB, free 1918.2 MiB)
20:14:04.648 INFO BlockManagerInfo - Removed broadcast_155_piece0 on localhost:35739 in memory (size: 56.2 KiB, free: 1919.7 MiB)
20:14:04.649 INFO BlockManagerInfo - Removed broadcast_160_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:04.649 INFO MemoryStore - Block broadcast_162_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.7 MiB)
20:14:04.649 INFO BlockManagerInfo - Added broadcast_162_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.8 MiB)
20:14:04.649 INFO SparkContext - Created broadcast 162 from broadcast at AnySamSinkMultiple.java:80
20:14:04.649 INFO BlockManagerInfo - Removed broadcast_154_piece0 on localhost:35739 in memory (size: 154.6 KiB, free: 1919.9 MiB)
20:14:04.650 INFO BlockManagerInfo - Removed broadcast_157_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1919.9 MiB)
20:14:04.652 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.652 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.652 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.666 INFO FileInputFormat - Total input files to process : 1
20:14:04.676 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:04.677 INFO DAGScheduler - Registering RDD 377 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 19
20:14:04.677 INFO DAGScheduler - Got job 65 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:04.677 INFO DAGScheduler - Final stage: ResultStage 90 (runJob at SparkHadoopWriter.scala:83)
20:14:04.677 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 89)
20:14:04.677 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 89)
20:14:04.678 INFO DAGScheduler - Submitting ShuffleMapStage 89 (MapPartitionsRDD[377] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:04.695 INFO MemoryStore - Block broadcast_163 stored as values in memory (estimated size 427.7 KiB, free 1918.9 MiB)
20:14:04.696 INFO MemoryStore - Block broadcast_163_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.8 MiB)
20:14:04.696 INFO BlockManagerInfo - Added broadcast_163_piece0 in memory on localhost:35739 (size: 154.6 KiB, free: 1919.8 MiB)
20:14:04.696 INFO SparkContext - Created broadcast 163 from broadcast at DAGScheduler.scala:1580
20:14:04.697 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 89 (MapPartitionsRDD[377] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:04.697 INFO TaskSchedulerImpl - Adding task set 89.0 with 1 tasks resource profile 0
20:14:04.697 INFO TaskSetManager - Starting task 0.0 in stage 89.0 (TID 133) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:04.698 INFO Executor - Running task 0.0 in stage 89.0 (TID 133)
20:14:04.731 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:04.748 INFO Executor - Finished task 0.0 in stage 89.0 (TID 133). 1149 bytes result sent to driver
20:14:04.748 INFO TaskSetManager - Finished task 0.0 in stage 89.0 (TID 133) in 51 ms on localhost (executor driver) (1/1)
20:14:04.748 INFO TaskSchedulerImpl - Removed TaskSet 89.0, whose tasks have all completed, from pool
20:14:04.749 INFO DAGScheduler - ShuffleMapStage 89 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.071 s
20:14:04.749 INFO DAGScheduler - looking for newly runnable stages
20:14:04.749 INFO DAGScheduler - running: HashSet()
20:14:04.749 INFO DAGScheduler - waiting: HashSet(ResultStage 90)
20:14:04.749 INFO DAGScheduler - failed: HashSet()
20:14:04.749 INFO DAGScheduler - Submitting ResultStage 90 (MapPartitionsRDD[389] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:04.755 INFO MemoryStore - Block broadcast_164 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
20:14:04.756 INFO MemoryStore - Block broadcast_164_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1918.6 MiB)
20:14:04.756 INFO BlockManagerInfo - Added broadcast_164_piece0 in memory on localhost:35739 (size: 56.2 KiB, free: 1919.7 MiB)
20:14:04.757 INFO SparkContext - Created broadcast 164 from broadcast at DAGScheduler.scala:1580
20:14:04.757 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 90 (MapPartitionsRDD[389] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.757 INFO TaskSchedulerImpl - Adding task set 90.0 with 2 tasks resource profile 0
20:14:04.757 INFO TaskSetManager - Starting task 0.0 in stage 90.0 (TID 134) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:04.758 INFO TaskSetManager - Starting task 1.0 in stage 90.0 (TID 135) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:04.758 INFO Executor - Running task 0.0 in stage 90.0 (TID 134)
20:14:04.758 INFO Executor - Running task 1.0 in stage 90.0 (TID 135)
20:14:04.762 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.762 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.762 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.762 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.763 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.763 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.764 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.764 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.764 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.764 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:04.764 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:04.764 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:04.774 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.774 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.778 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.778 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.782 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014042048055808093040347_0389_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest117948367070900730266.bam/_temporary/0/task_202502102014042048055808093040347_0389_r_000001
20:14:04.782 INFO SparkHadoopMapRedUtil - attempt_202502102014042048055808093040347_0389_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:04.783 INFO Executor - Finished task 1.0 in stage 90.0 (TID 135). 1729 bytes result sent to driver
20:14:04.784 INFO TaskSetManager - Finished task 1.0 in stage 90.0 (TID 135) in 26 ms on localhost (executor driver) (1/2)
20:14:04.785 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014042048055808093040347_0389_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest117948367070900730266.bam/_temporary/0/task_202502102014042048055808093040347_0389_r_000000
20:14:04.785 INFO SparkHadoopMapRedUtil - attempt_202502102014042048055808093040347_0389_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:04.786 INFO Executor - Finished task 0.0 in stage 90.0 (TID 134). 1729 bytes result sent to driver
20:14:04.786 INFO TaskSetManager - Finished task 0.0 in stage 90.0 (TID 134) in 29 ms on localhost (executor driver) (2/2)
20:14:04.786 INFO TaskSchedulerImpl - Removed TaskSet 90.0, whose tasks have all completed, from pool
20:14:04.786 INFO DAGScheduler - ResultStage 90 (runJob at SparkHadoopWriter.scala:83) finished in 0.037 s
20:14:04.786 INFO DAGScheduler - Job 65 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.786 INFO TaskSchedulerImpl - Killing all running tasks in stage 90: Stage finished
20:14:04.786 INFO DAGScheduler - Job 65 finished: runJob at SparkHadoopWriter.scala:83, took 0.109924 s
20:14:04.787 INFO SparkHadoopWriter - Start to commit write Job job_202502102014042048055808093040347_0389.
20:14:04.792 INFO SparkHadoopWriter - Write Job job_202502102014042048055808093040347_0389 committed. Elapsed time: 4 ms.
20:14:04.794 INFO MemoryStore - Block broadcast_165 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:04.801 INFO MemoryStore - Block broadcast_165_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
20:14:04.801 INFO BlockManagerInfo - Added broadcast_165_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:04.801 INFO SparkContext - Created broadcast 165 from newAPIHadoopFile at PathSplitSource.java:96
20:14:04.827 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:04.828 INFO DAGScheduler - Got job 66 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:04.828 INFO DAGScheduler - Final stage: ResultStage 92 (count at ReadsSparkSinkUnitTest.java:222)
20:14:04.828 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 91)
20:14:04.828 INFO DAGScheduler - Missing parents: List()
20:14:04.828 INFO DAGScheduler - Submitting ResultStage 92 (MapPartitionsRDD[380] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:04.829 INFO MemoryStore - Block broadcast_166 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
20:14:04.829 INFO MemoryStore - Block broadcast_166_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
20:14:04.829 INFO BlockManagerInfo - Added broadcast_166_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.7 MiB)
20:14:04.830 INFO SparkContext - Created broadcast 166 from broadcast at DAGScheduler.scala:1580
20:14:04.830 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 92 (MapPartitionsRDD[380] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.830 INFO TaskSchedulerImpl - Adding task set 92.0 with 2 tasks resource profile 0
20:14:04.830 INFO TaskSetManager - Starting task 0.0 in stage 92.0 (TID 136) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:04.831 INFO TaskSetManager - Starting task 1.0 in stage 92.0 (TID 137) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:04.831 INFO Executor - Running task 1.0 in stage 92.0 (TID 137)
20:14:04.831 INFO Executor - Running task 0.0 in stage 92.0 (TID 136)
20:14:04.833 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.833 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:04.833 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.833 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:04.836 INFO Executor - Finished task 0.0 in stage 92.0 (TID 136). 1591 bytes result sent to driver
20:14:04.837 INFO TaskSetManager - Finished task 0.0 in stage 92.0 (TID 136) in 7 ms on localhost (executor driver) (1/2)
20:14:04.837 INFO Executor - Finished task 1.0 in stage 92.0 (TID 137). 1591 bytes result sent to driver
20:14:04.837 INFO TaskSetManager - Finished task 1.0 in stage 92.0 (TID 137) in 6 ms on localhost (executor driver) (2/2)
20:14:04.837 INFO TaskSchedulerImpl - Removed TaskSet 92.0, whose tasks have all completed, from pool
20:14:04.838 INFO DAGScheduler - ResultStage 92 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
20:14:04.838 INFO DAGScheduler - Job 66 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.838 INFO TaskSchedulerImpl - Killing all running tasks in stage 92: Stage finished
20:14:04.838 INFO DAGScheduler - Job 66 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010379 s
20:14:04.851 INFO FileInputFormat - Total input files to process : 2
20:14:04.854 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:04.855 INFO DAGScheduler - Got job 67 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:04.855 INFO DAGScheduler - Final stage: ResultStage 93 (count at ReadsSparkSinkUnitTest.java:222)
20:14:04.855 INFO DAGScheduler - Parents of final stage: List()
20:14:04.855 INFO DAGScheduler - Missing parents: List()
20:14:04.855 INFO DAGScheduler - Submitting ResultStage 93 (MapPartitionsRDD[396] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:04.871 INFO MemoryStore - Block broadcast_167 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:14:04.873 INFO MemoryStore - Block broadcast_167_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:14:04.873 INFO BlockManagerInfo - Added broadcast_167_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:04.873 INFO SparkContext - Created broadcast 167 from broadcast at DAGScheduler.scala:1580
20:14:04.873 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 93 (MapPartitionsRDD[396] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:04.873 INFO TaskSchedulerImpl - Adding task set 93.0 with 2 tasks resource profile 0
20:14:04.874 INFO TaskSetManager - Starting task 0.0 in stage 93.0 (TID 138) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
20:14:04.874 INFO TaskSetManager - Starting task 1.0 in stage 93.0 (TID 139) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
20:14:04.874 INFO Executor - Running task 0.0 in stage 93.0 (TID 138)
20:14:04.874 INFO Executor - Running task 1.0 in stage 93.0 (TID 139)
20:14:04.903 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117948367070900730266.bam/part-r-00001.bam:0+129330
20:14:04.912 INFO Executor - Finished task 0.0 in stage 93.0 (TID 138). 989 bytes result sent to driver
20:14:04.913 INFO TaskSetManager - Finished task 0.0 in stage 93.0 (TID 138) in 39 ms on localhost (executor driver) (1/2)
20:14:04.918 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117948367070900730266.bam/part-r-00000.bam:0+132492
20:14:04.927 INFO Executor - Finished task 1.0 in stage 93.0 (TID 139). 989 bytes result sent to driver
20:14:04.927 INFO TaskSetManager - Finished task 1.0 in stage 93.0 (TID 139) in 53 ms on localhost (executor driver) (2/2)
20:14:04.927 INFO TaskSchedulerImpl - Removed TaskSet 93.0, whose tasks have all completed, from pool
20:14:04.927 INFO DAGScheduler - ResultStage 93 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.072 s
20:14:04.927 INFO DAGScheduler - Job 67 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:04.927 INFO TaskSchedulerImpl - Killing all running tasks in stage 93: Stage finished
20:14:04.928 INFO DAGScheduler - Job 67 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.073184 s
20:14:04.931 INFO MemoryStore - Block broadcast_168 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
20:14:04.941 INFO MemoryStore - Block broadcast_168_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
20:14:04.942 INFO BlockManagerInfo - Added broadcast_168_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:04.942 INFO SparkContext - Created broadcast 168 from newAPIHadoopFile at PathSplitSource.java:96
20:14:04.975 INFO MemoryStore - Block broadcast_169 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
20:14:04.985 INFO BlockManagerInfo - Removed broadcast_167_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.6 MiB)
20:14:04.985 INFO BlockManagerInfo - Removed broadcast_164_piece0 on localhost:35739 in memory (size: 56.2 KiB, free: 1919.7 MiB)
20:14:04.986 INFO BlockManagerInfo - Removed broadcast_162_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:14:04.986 INFO BlockManagerInfo - Removed broadcast_163_piece0 on localhost:35739 in memory (size: 154.6 KiB, free: 1919.8 MiB)
20:14:04.987 INFO BlockManagerInfo - Removed broadcast_165_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:04.988 INFO BlockManagerInfo - Removed broadcast_159_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:04.988 INFO BlockManagerInfo - Removed broadcast_161_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.9 MiB)
20:14:04.989 INFO BlockManagerInfo - Removed broadcast_166_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1920.0 MiB)
20:14:04.991 INFO MemoryStore - Block broadcast_169_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:14:04.991 INFO BlockManagerInfo - Added broadcast_169_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:04.992 INFO SparkContext - Created broadcast 169 from newAPIHadoopFile at PathSplitSource.java:96
20:14:05.012 INFO MemoryStore - Block broadcast_170 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
20:14:05.013 INFO MemoryStore - Block broadcast_170_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
20:14:05.013 INFO BlockManagerInfo - Added broadcast_170_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:14:05.013 INFO SparkContext - Created broadcast 170 from broadcast at ReadsSparkSink.java:133
20:14:05.015 INFO MemoryStore - Block broadcast_171 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
20:14:05.015 INFO MemoryStore - Block broadcast_171_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
20:14:05.016 INFO BlockManagerInfo - Added broadcast_171_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:14:05.016 INFO SparkContext - Created broadcast 171 from broadcast at AnySamSinkMultiple.java:80
20:14:05.018 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.018 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.018 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.030 INFO FileInputFormat - Total input files to process : 1
20:14:05.036 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:05.036 INFO DAGScheduler - Registering RDD 404 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 20
20:14:05.037 INFO DAGScheduler - Got job 68 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:05.037 INFO DAGScheduler - Final stage: ResultStage 95 (runJob at SparkHadoopWriter.scala:83)
20:14:05.037 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 94)
20:14:05.037 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 94)
20:14:05.037 INFO DAGScheduler - Submitting ShuffleMapStage 94 (MapPartitionsRDD[404] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:05.054 INFO MemoryStore - Block broadcast_172 stored as values in memory (estimated size 427.7 KiB, free 1918.6 MiB)
20:14:05.056 INFO MemoryStore - Block broadcast_172_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.4 MiB)
20:14:05.056 INFO BlockManagerInfo - Added broadcast_172_piece0 in memory on localhost:35739 (size: 154.6 KiB, free: 1919.7 MiB)
20:14:05.056 INFO SparkContext - Created broadcast 172 from broadcast at DAGScheduler.scala:1580
20:14:05.057 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 94 (MapPartitionsRDD[404] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:05.057 INFO TaskSchedulerImpl - Adding task set 94.0 with 1 tasks resource profile 0
20:14:05.057 INFO TaskSetManager - Starting task 0.0 in stage 94.0 (TID 140) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:05.057 INFO Executor - Running task 0.0 in stage 94.0 (TID 140)
20:14:05.087 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:05.104 INFO Executor - Finished task 0.0 in stage 94.0 (TID 140). 1149 bytes result sent to driver
20:14:05.104 INFO TaskSetManager - Finished task 0.0 in stage 94.0 (TID 140) in 47 ms on localhost (executor driver) (1/1)
20:14:05.104 INFO TaskSchedulerImpl - Removed TaskSet 94.0, whose tasks have all completed, from pool
20:14:05.104 INFO DAGScheduler - ShuffleMapStage 94 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.067 s
20:14:05.104 INFO DAGScheduler - looking for newly runnable stages
20:14:05.104 INFO DAGScheduler - running: HashSet()
20:14:05.104 INFO DAGScheduler - waiting: HashSet(ResultStage 95)
20:14:05.104 INFO DAGScheduler - failed: HashSet()
20:14:05.105 INFO DAGScheduler - Submitting ResultStage 95 (MapPartitionsRDD[416] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:05.111 INFO MemoryStore - Block broadcast_173 stored as values in memory (estimated size 150.2 KiB, free 1918.3 MiB)
20:14:05.112 INFO MemoryStore - Block broadcast_173_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1918.2 MiB)
20:14:05.112 INFO BlockManagerInfo - Added broadcast_173_piece0 in memory on localhost:35739 (size: 56.2 KiB, free: 1919.7 MiB)
20:14:05.112 INFO SparkContext - Created broadcast 173 from broadcast at DAGScheduler.scala:1580
20:14:05.112 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 95 (MapPartitionsRDD[416] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:05.112 INFO TaskSchedulerImpl - Adding task set 95.0 with 2 tasks resource profile 0
20:14:05.113 INFO TaskSetManager - Starting task 0.0 in stage 95.0 (TID 141) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:05.113 INFO TaskSetManager - Starting task 1.0 in stage 95.0 (TID 142) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:05.114 INFO Executor - Running task 0.0 in stage 95.0 (TID 141)
20:14:05.114 INFO Executor - Running task 1.0 in stage 95.0 (TID 142)
20:14:05.118 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.118 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.118 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.119 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.119 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.119 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.120 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.120 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.120 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.120 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.120 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.120 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.130 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.130 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.135 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.135 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.137 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014057448461642796000463_0416_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest18229204570767761383.bam/_temporary/0/task_202502102014057448461642796000463_0416_r_000001
20:14:05.137 INFO SparkHadoopMapRedUtil - attempt_202502102014057448461642796000463_0416_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:05.137 INFO Executor - Finished task 1.0 in stage 95.0 (TID 142). 1729 bytes result sent to driver
20:14:05.138 INFO TaskSetManager - Finished task 1.0 in stage 95.0 (TID 142) in 25 ms on localhost (executor driver) (1/2)
20:14:05.142 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014057448461642796000463_0416_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest18229204570767761383.bam/_temporary/0/task_202502102014057448461642796000463_0416_r_000000
20:14:05.142 INFO SparkHadoopMapRedUtil - attempt_202502102014057448461642796000463_0416_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:05.143 INFO Executor - Finished task 0.0 in stage 95.0 (TID 141). 1729 bytes result sent to driver
20:14:05.143 INFO TaskSetManager - Finished task 0.0 in stage 95.0 (TID 141) in 30 ms on localhost (executor driver) (2/2)
20:14:05.143 INFO TaskSchedulerImpl - Removed TaskSet 95.0, whose tasks have all completed, from pool
20:14:05.143 INFO DAGScheduler - ResultStage 95 (runJob at SparkHadoopWriter.scala:83) finished in 0.038 s
20:14:05.143 INFO DAGScheduler - Job 68 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:05.143 INFO TaskSchedulerImpl - Killing all running tasks in stage 95: Stage finished
20:14:05.143 INFO DAGScheduler - Job 68 finished: runJob at SparkHadoopWriter.scala:83, took 0.107209 s
20:14:05.144 INFO SparkHadoopWriter - Start to commit write Job job_202502102014057448461642796000463_0416.
20:14:05.149 INFO SparkHadoopWriter - Write Job job_202502102014057448461642796000463_0416 committed. Elapsed time: 5 ms.
20:14:05.152 INFO MemoryStore - Block broadcast_174 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
20:14:05.162 INFO MemoryStore - Block broadcast_174_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
20:14:05.162 INFO BlockManagerInfo - Added broadcast_174_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:05.163 INFO SparkContext - Created broadcast 174 from newAPIHadoopFile at PathSplitSource.java:96
20:14:05.185 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:05.186 INFO DAGScheduler - Got job 69 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:05.186 INFO DAGScheduler - Final stage: ResultStage 97 (count at ReadsSparkSinkUnitTest.java:222)
20:14:05.186 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 96)
20:14:05.186 INFO DAGScheduler - Missing parents: List()
20:14:05.186 INFO DAGScheduler - Submitting ResultStage 97 (MapPartitionsRDD[407] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:05.187 INFO MemoryStore - Block broadcast_175 stored as values in memory (estimated size 6.3 KiB, free 1917.9 MiB)
20:14:05.187 INFO MemoryStore - Block broadcast_175_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1917.9 MiB)
20:14:05.187 INFO BlockManagerInfo - Added broadcast_175_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.6 MiB)
20:14:05.188 INFO SparkContext - Created broadcast 175 from broadcast at DAGScheduler.scala:1580
20:14:05.188 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 97 (MapPartitionsRDD[407] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:05.188 INFO TaskSchedulerImpl - Adding task set 97.0 with 2 tasks resource profile 0
20:14:05.189 INFO TaskSetManager - Starting task 0.0 in stage 97.0 (TID 143) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:05.189 INFO TaskSetManager - Starting task 1.0 in stage 97.0 (TID 144) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:05.189 INFO Executor - Running task 0.0 in stage 97.0 (TID 143)
20:14:05.189 INFO Executor - Running task 1.0 in stage 97.0 (TID 144)
20:14:05.191 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.191 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.191 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.191 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.194 INFO Executor - Finished task 1.0 in stage 97.0 (TID 144). 1591 bytes result sent to driver
20:14:05.194 INFO TaskSetManager - Finished task 1.0 in stage 97.0 (TID 144) in 5 ms on localhost (executor driver) (1/2)
20:14:05.195 INFO Executor - Finished task 0.0 in stage 97.0 (TID 143). 1591 bytes result sent to driver
20:14:05.195 INFO TaskSetManager - Finished task 0.0 in stage 97.0 (TID 143) in 7 ms on localhost (executor driver) (2/2)
20:14:05.195 INFO TaskSchedulerImpl - Removed TaskSet 97.0, whose tasks have all completed, from pool
20:14:05.195 INFO DAGScheduler - ResultStage 97 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.009 s
20:14:05.195 INFO DAGScheduler - Job 69 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:05.195 INFO TaskSchedulerImpl - Killing all running tasks in stage 97: Stage finished
20:14:05.195 INFO DAGScheduler - Job 69 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.009896 s
20:14:05.208 INFO FileInputFormat - Total input files to process : 2
20:14:05.212 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:05.212 INFO DAGScheduler - Got job 70 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:05.212 INFO DAGScheduler - Final stage: ResultStage 98 (count at ReadsSparkSinkUnitTest.java:222)
20:14:05.212 INFO DAGScheduler - Parents of final stage: List()
20:14:05.212 INFO DAGScheduler - Missing parents: List()
20:14:05.212 INFO DAGScheduler - Submitting ResultStage 98 (MapPartitionsRDD[423] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:05.231 INFO MemoryStore - Block broadcast_176 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
20:14:05.232 INFO MemoryStore - Block broadcast_176_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.3 MiB)
20:14:05.233 INFO BlockManagerInfo - Added broadcast_176_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:05.233 INFO SparkContext - Created broadcast 176 from broadcast at DAGScheduler.scala:1580
20:14:05.233 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 98 (MapPartitionsRDD[423] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:05.233 INFO TaskSchedulerImpl - Adding task set 98.0 with 2 tasks resource profile 0
20:14:05.234 INFO TaskSetManager - Starting task 0.0 in stage 98.0 (TID 145) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
20:14:05.234 INFO TaskSetManager - Starting task 1.0 in stage 98.0 (TID 146) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
20:14:05.234 INFO Executor - Running task 0.0 in stage 98.0 (TID 145)
20:14:05.234 INFO Executor - Running task 1.0 in stage 98.0 (TID 146)
20:14:05.277 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest18229204570767761383.bam/part-r-00000.bam:0+132492
20:14:05.277 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest18229204570767761383.bam/part-r-00001.bam:0+129330
20:14:05.293 INFO Executor - Finished task 0.0 in stage 98.0 (TID 145). 1075 bytes result sent to driver
20:14:05.294 INFO TaskSetManager - Finished task 0.0 in stage 98.0 (TID 145) in 60 ms on localhost (executor driver) (1/2)
20:14:05.294 INFO BlockManagerInfo - Removed broadcast_170_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:05.296 INFO BlockManagerInfo - Removed broadcast_175_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1919.5 MiB)
20:14:05.297 INFO BlockManagerInfo - Removed broadcast_169_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:05.298 INFO Executor - Finished task 1.0 in stage 98.0 (TID 146). 1032 bytes result sent to driver
20:14:05.298 INFO TaskSetManager - Finished task 1.0 in stage 98.0 (TID 146) in 64 ms on localhost (executor driver) (2/2)
20:14:05.298 INFO TaskSchedulerImpl - Removed TaskSet 98.0, whose tasks have all completed, from pool
20:14:05.298 INFO DAGScheduler - ResultStage 98 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.085 s
20:14:05.298 INFO DAGScheduler - Job 70 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:05.298 INFO TaskSchedulerImpl - Killing all running tasks in stage 98: Stage finished
20:14:05.299 INFO BlockManagerInfo - Removed broadcast_172_piece0 on localhost:35739 in memory (size: 154.6 KiB, free: 1919.7 MiB)
20:14:05.299 INFO DAGScheduler - Job 70 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.087551 s
20:14:05.300 INFO BlockManagerInfo - Removed broadcast_173_piece0 on localhost:35739 in memory (size: 56.2 KiB, free: 1919.7 MiB)
20:14:05.301 INFO BlockManagerInfo - Removed broadcast_171_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:14:05.303 INFO MemoryStore - Block broadcast_177 stored as values in memory (estimated size 298.0 KiB, free 1918.5 MiB)
20:14:05.314 INFO MemoryStore - Block broadcast_177_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.4 MiB)
20:14:05.314 INFO BlockManagerInfo - Added broadcast_177_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.7 MiB)
20:14:05.315 INFO SparkContext - Created broadcast 177 from newAPIHadoopFile at PathSplitSource.java:96
20:14:05.355 INFO MemoryStore - Block broadcast_178 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
20:14:05.380 INFO MemoryStore - Block broadcast_178_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.1 MiB)
20:14:05.380 INFO BlockManagerInfo - Added broadcast_178_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.7 MiB)
20:14:05.380 INFO SparkContext - Created broadcast 178 from newAPIHadoopFile at PathSplitSource.java:96
20:14:05.416 INFO MemoryStore - Block broadcast_179 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
20:14:05.417 INFO MemoryStore - Block broadcast_179_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
20:14:05.417 INFO BlockManagerInfo - Added broadcast_179_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:05.417 INFO SparkContext - Created broadcast 179 from broadcast at ReadsSparkSink.java:133
20:14:05.419 INFO MemoryStore - Block broadcast_180 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
20:14:05.420 INFO MemoryStore - Block broadcast_180_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
20:14:05.420 INFO BlockManagerInfo - Added broadcast_180_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:05.420 INFO SparkContext - Created broadcast 180 from broadcast at AnySamSinkMultiple.java:80
20:14:05.423 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.423 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.423 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.442 INFO FileInputFormat - Total input files to process : 1
20:14:05.449 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:05.450 INFO DAGScheduler - Registering RDD 431 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 21
20:14:05.450 INFO DAGScheduler - Got job 71 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:05.450 INFO DAGScheduler - Final stage: ResultStage 100 (runJob at SparkHadoopWriter.scala:83)
20:14:05.450 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 99)
20:14:05.450 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 99)
20:14:05.450 INFO DAGScheduler - Submitting ShuffleMapStage 99 (MapPartitionsRDD[431] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:05.468 INFO MemoryStore - Block broadcast_181 stored as values in memory (estimated size 427.7 KiB, free 1917.3 MiB)
20:14:05.470 INFO MemoryStore - Block broadcast_181_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.2 MiB)
20:14:05.470 INFO BlockManagerInfo - Added broadcast_181_piece0 in memory on localhost:35739 (size: 154.6 KiB, free: 1919.5 MiB)
20:14:05.470 INFO SparkContext - Created broadcast 181 from broadcast at DAGScheduler.scala:1580
20:14:05.470 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 99 (MapPartitionsRDD[431] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:05.471 INFO TaskSchedulerImpl - Adding task set 99.0 with 1 tasks resource profile 0
20:14:05.471 INFO TaskSetManager - Starting task 0.0 in stage 99.0 (TID 147) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
20:14:05.472 INFO Executor - Running task 0.0 in stage 99.0 (TID 147)
20:14:05.513 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:14:05.532 INFO Executor - Finished task 0.0 in stage 99.0 (TID 147). 1149 bytes result sent to driver
20:14:05.532 INFO TaskSetManager - Finished task 0.0 in stage 99.0 (TID 147) in 61 ms on localhost (executor driver) (1/1)
20:14:05.532 INFO TaskSchedulerImpl - Removed TaskSet 99.0, whose tasks have all completed, from pool
20:14:05.533 INFO DAGScheduler - ShuffleMapStage 99 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.082 s
20:14:05.533 INFO DAGScheduler - looking for newly runnable stages
20:14:05.533 INFO DAGScheduler - running: HashSet()
20:14:05.533 INFO DAGScheduler - waiting: HashSet(ResultStage 100)
20:14:05.533 INFO DAGScheduler - failed: HashSet()
20:14:05.533 INFO DAGScheduler - Submitting ResultStage 100 (MapPartitionsRDD[443] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:05.543 INFO MemoryStore - Block broadcast_182 stored as values in memory (estimated size 150.2 KiB, free 1917.0 MiB)
20:14:05.544 INFO MemoryStore - Block broadcast_182_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1917.0 MiB)
20:14:05.544 INFO BlockManagerInfo - Added broadcast_182_piece0 in memory on localhost:35739 (size: 56.2 KiB, free: 1919.4 MiB)
20:14:05.544 INFO SparkContext - Created broadcast 182 from broadcast at DAGScheduler.scala:1580
20:14:05.544 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 100 (MapPartitionsRDD[443] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:05.544 INFO TaskSchedulerImpl - Adding task set 100.0 with 2 tasks resource profile 0
20:14:05.545 INFO TaskSetManager - Starting task 0.0 in stage 100.0 (TID 148) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:05.545 INFO TaskSetManager - Starting task 1.0 in stage 100.0 (TID 149) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:05.545 INFO Executor - Running task 0.0 in stage 100.0 (TID 148)
20:14:05.545 INFO Executor - Running task 1.0 in stage 100.0 (TID 149)
20:14:05.550 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.550 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.550 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.550 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.550 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.550 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.551 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.551 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.551 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.551 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.551 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.551 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.569 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.569 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.571 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.572 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.576 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014053594230537857745029_0443_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest29049512855891920154.bam/_temporary/0/task_202502102014053594230537857745029_0443_r_000000
20:14:05.576 INFO SparkHadoopMapRedUtil - attempt_202502102014053594230537857745029_0443_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:05.576 INFO Executor - Finished task 0.0 in stage 100.0 (TID 148). 1729 bytes result sent to driver
20:14:05.576 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014053594230537857745029_0443_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest29049512855891920154.bam/_temporary/0/task_202502102014053594230537857745029_0443_r_000001
20:14:05.577 INFO SparkHadoopMapRedUtil - attempt_202502102014053594230537857745029_0443_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:05.577 INFO TaskSetManager - Finished task 0.0 in stage 100.0 (TID 148) in 32 ms on localhost (executor driver) (1/2)
20:14:05.577 INFO Executor - Finished task 1.0 in stage 100.0 (TID 149). 1729 bytes result sent to driver
20:14:05.577 INFO TaskSetManager - Finished task 1.0 in stage 100.0 (TID 149) in 32 ms on localhost (executor driver) (2/2)
20:14:05.577 INFO TaskSchedulerImpl - Removed TaskSet 100.0, whose tasks have all completed, from pool
20:14:05.578 INFO DAGScheduler - ResultStage 100 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
20:14:05.578 INFO DAGScheduler - Job 71 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:05.578 INFO TaskSchedulerImpl - Killing all running tasks in stage 100: Stage finished
20:14:05.578 INFO DAGScheduler - Job 71 finished: runJob at SparkHadoopWriter.scala:83, took 0.128309 s
20:14:05.578 INFO SparkHadoopWriter - Start to commit write Job job_202502102014053594230537857745029_0443.
20:14:05.583 INFO SparkHadoopWriter - Write Job job_202502102014053594230537857745029_0443 committed. Elapsed time: 5 ms.
20:14:05.586 INFO MemoryStore - Block broadcast_183 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
20:14:05.597 INFO MemoryStore - Block broadcast_183_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
20:14:05.597 INFO BlockManagerInfo - Added broadcast_183_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:14:05.597 INFO SparkContext - Created broadcast 183 from newAPIHadoopFile at PathSplitSource.java:96
20:14:05.621 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:05.622 INFO DAGScheduler - Got job 72 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:05.622 INFO DAGScheduler - Final stage: ResultStage 102 (count at ReadsSparkSinkUnitTest.java:222)
20:14:05.622 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 101)
20:14:05.622 INFO DAGScheduler - Missing parents: List()
20:14:05.622 INFO DAGScheduler - Submitting ResultStage 102 (MapPartitionsRDD[434] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:05.623 INFO MemoryStore - Block broadcast_184 stored as values in memory (estimated size 6.3 KiB, free 1916.6 MiB)
20:14:05.623 INFO MemoryStore - Block broadcast_184_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.6 MiB)
20:14:05.623 INFO BlockManagerInfo - Added broadcast_184_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.4 MiB)
20:14:05.623 INFO SparkContext - Created broadcast 184 from broadcast at DAGScheduler.scala:1580
20:14:05.624 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 102 (MapPartitionsRDD[434] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:05.624 INFO TaskSchedulerImpl - Adding task set 102.0 with 2 tasks resource profile 0
20:14:05.624 INFO TaskSetManager - Starting task 0.0 in stage 102.0 (TID 150) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:05.624 INFO TaskSetManager - Starting task 1.0 in stage 102.0 (TID 151) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:05.625 INFO Executor - Running task 0.0 in stage 102.0 (TID 150)
20:14:05.625 INFO Executor - Running task 1.0 in stage 102.0 (TID 151)
20:14:05.627 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.627 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.627 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.627 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.630 INFO Executor - Finished task 1.0 in stage 102.0 (TID 151). 1591 bytes result sent to driver
20:14:05.630 INFO TaskSetManager - Finished task 1.0 in stage 102.0 (TID 151) in 6 ms on localhost (executor driver) (1/2)
20:14:05.631 INFO Executor - Finished task 0.0 in stage 102.0 (TID 150). 1591 bytes result sent to driver
20:14:05.631 INFO TaskSetManager - Finished task 0.0 in stage 102.0 (TID 150) in 7 ms on localhost (executor driver) (2/2)
20:14:05.631 INFO TaskSchedulerImpl - Removed TaskSet 102.0, whose tasks have all completed, from pool
20:14:05.631 INFO DAGScheduler - ResultStage 102 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.009 s
20:14:05.631 INFO DAGScheduler - Job 72 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:05.631 INFO TaskSchedulerImpl - Killing all running tasks in stage 102: Stage finished
20:14:05.631 INFO DAGScheduler - Job 72 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010298 s
20:14:05.644 INFO FileInputFormat - Total input files to process : 2
20:14:05.648 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:05.648 INFO DAGScheduler - Got job 73 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:05.648 INFO DAGScheduler - Final stage: ResultStage 103 (count at ReadsSparkSinkUnitTest.java:222)
20:14:05.648 INFO DAGScheduler - Parents of final stage: List()
20:14:05.648 INFO DAGScheduler - Missing parents: List()
20:14:05.648 INFO DAGScheduler - Submitting ResultStage 103 (MapPartitionsRDD[450] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:05.665 INFO MemoryStore - Block broadcast_185 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
20:14:05.666 INFO MemoryStore - Block broadcast_185_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.1 MiB)
20:14:05.667 INFO BlockManagerInfo - Added broadcast_185_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.2 MiB)
20:14:05.667 INFO SparkContext - Created broadcast 185 from broadcast at DAGScheduler.scala:1580
20:14:05.667 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 103 (MapPartitionsRDD[450] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:05.667 INFO TaskSchedulerImpl - Adding task set 103.0 with 2 tasks resource profile 0
20:14:05.668 INFO TaskSetManager - Starting task 0.0 in stage 103.0 (TID 152) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
20:14:05.668 INFO TaskSetManager - Starting task 1.0 in stage 103.0 (TID 153) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
20:14:05.668 INFO Executor - Running task 1.0 in stage 103.0 (TID 153)
20:14:05.668 INFO Executor - Running task 0.0 in stage 103.0 (TID 152)
20:14:05.698 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest29049512855891920154.bam/part-r-00000.bam:0+129755
20:14:05.698 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest29049512855891920154.bam/part-r-00001.bam:0+129440
20:14:05.707 INFO Executor - Finished task 1.0 in stage 103.0 (TID 153). 989 bytes result sent to driver
20:14:05.707 INFO TaskSetManager - Finished task 1.0 in stage 103.0 (TID 153) in 39 ms on localhost (executor driver) (1/2)
20:14:05.711 INFO Executor - Finished task 0.0 in stage 103.0 (TID 152). 989 bytes result sent to driver
20:14:05.711 INFO TaskSetManager - Finished task 0.0 in stage 103.0 (TID 152) in 44 ms on localhost (executor driver) (2/2)
20:14:05.711 INFO TaskSchedulerImpl - Removed TaskSet 103.0, whose tasks have all completed, from pool
20:14:05.711 INFO DAGScheduler - ResultStage 103 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.063 s
20:14:05.711 INFO DAGScheduler - Job 73 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:05.711 INFO TaskSchedulerImpl - Killing all running tasks in stage 103: Stage finished
20:14:05.711 INFO DAGScheduler - Job 73 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.063733 s
20:14:05.714 INFO MemoryStore - Block broadcast_186 stored as values in memory (estimated size 298.0 KiB, free 1915.8 MiB)
20:14:05.725 INFO MemoryStore - Block broadcast_186_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.7 MiB)
20:14:05.725 INFO BlockManagerInfo - Added broadcast_186_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:05.726 INFO SparkContext - Created broadcast 186 from newAPIHadoopFile at PathSplitSource.java:96
20:14:05.754 INFO MemoryStore - Block broadcast_187 stored as values in memory (estimated size 298.0 KiB, free 1915.4 MiB)
20:14:05.760 INFO MemoryStore - Block broadcast_187_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.4 MiB)
20:14:05.761 INFO BlockManagerInfo - Added broadcast_187_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.1 MiB)
20:14:05.761 INFO SparkContext - Created broadcast 187 from newAPIHadoopFile at PathSplitSource.java:96
20:14:05.781 INFO MemoryStore - Block broadcast_188 stored as values in memory (estimated size 19.6 KiB, free 1915.4 MiB)
20:14:05.781 INFO MemoryStore - Block broadcast_188_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1915.4 MiB)
20:14:05.781 INFO BlockManagerInfo - Added broadcast_188_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.1 MiB)
20:14:05.781 INFO SparkContext - Created broadcast 188 from broadcast at ReadsSparkSink.java:133
20:14:05.782 INFO MemoryStore - Block broadcast_189 stored as values in memory (estimated size 20.0 KiB, free 1915.3 MiB)
20:14:05.788 INFO MemoryStore - Block broadcast_189_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1915.3 MiB)
20:14:05.788 INFO BlockManagerInfo - Added broadcast_189_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.1 MiB)
20:14:05.789 INFO SparkContext - Created broadcast 189 from broadcast at AnySamSinkMultiple.java:80
20:14:05.789 INFO BlockManagerInfo - Removed broadcast_185_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.3 MiB)
20:14:05.789 INFO BlockManagerInfo - Removed broadcast_180_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.3 MiB)
20:14:05.790 INFO BlockManagerInfo - Removed broadcast_177_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.3 MiB)
20:14:05.791 INFO BlockManagerInfo - Removed broadcast_179_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.3 MiB)
20:14:05.791 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.792 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.792 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.792 INFO BlockManagerInfo - Removed broadcast_168_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:05.793 INFO BlockManagerInfo - Removed broadcast_183_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:05.794 INFO BlockManagerInfo - Removed broadcast_182_piece0 on localhost:35739 in memory (size: 56.2 KiB, free: 1919.5 MiB)
20:14:05.794 INFO BlockManagerInfo - Removed broadcast_178_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.5 MiB)
20:14:05.795 INFO BlockManagerInfo - Removed broadcast_174_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:05.795 INFO BlockManagerInfo - Removed broadcast_184_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1919.6 MiB)
20:14:05.796 INFO BlockManagerInfo - Removed broadcast_176_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:05.797 INFO BlockManagerInfo - Removed broadcast_181_piece0 on localhost:35739 in memory (size: 154.6 KiB, free: 1919.9 MiB)
20:14:05.797 INFO BlockManagerInfo - Removed broadcast_187_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:05.806 INFO FileInputFormat - Total input files to process : 1
20:14:05.816 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:05.817 INFO DAGScheduler - Registering RDD 458 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 22
20:14:05.817 INFO DAGScheduler - Got job 74 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:05.817 INFO DAGScheduler - Final stage: ResultStage 105 (runJob at SparkHadoopWriter.scala:83)
20:14:05.817 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 104)
20:14:05.817 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 104)
20:14:05.817 INFO DAGScheduler - Submitting ShuffleMapStage 104 (MapPartitionsRDD[458] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:05.834 INFO MemoryStore - Block broadcast_190 stored as values in memory (estimated size 427.7 KiB, free 1919.2 MiB)
20:14:05.836 INFO MemoryStore - Block broadcast_190_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1919.0 MiB)
20:14:05.836 INFO BlockManagerInfo - Added broadcast_190_piece0 in memory on localhost:35739 (size: 154.6 KiB, free: 1919.8 MiB)
20:14:05.836 INFO SparkContext - Created broadcast 190 from broadcast at DAGScheduler.scala:1580
20:14:05.836 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 104 (MapPartitionsRDD[458] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:05.836 INFO TaskSchedulerImpl - Adding task set 104.0 with 1 tasks resource profile 0
20:14:05.837 INFO TaskSetManager - Starting task 0.0 in stage 104.0 (TID 154) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
20:14:05.837 INFO Executor - Running task 0.0 in stage 104.0 (TID 154)
20:14:05.866 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:14:05.881 INFO Executor - Finished task 0.0 in stage 104.0 (TID 154). 1149 bytes result sent to driver
20:14:05.882 INFO TaskSetManager - Finished task 0.0 in stage 104.0 (TID 154) in 45 ms on localhost (executor driver) (1/1)
20:14:05.882 INFO TaskSchedulerImpl - Removed TaskSet 104.0, whose tasks have all completed, from pool
20:14:05.882 INFO DAGScheduler - ShuffleMapStage 104 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.064 s
20:14:05.882 INFO DAGScheduler - looking for newly runnable stages
20:14:05.882 INFO DAGScheduler - running: HashSet()
20:14:05.882 INFO DAGScheduler - waiting: HashSet(ResultStage 105)
20:14:05.882 INFO DAGScheduler - failed: HashSet()
20:14:05.882 INFO DAGScheduler - Submitting ResultStage 105 (MapPartitionsRDD[470] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:05.889 INFO MemoryStore - Block broadcast_191 stored as values in memory (estimated size 150.2 KiB, free 1918.9 MiB)
20:14:05.889 INFO MemoryStore - Block broadcast_191_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1918.8 MiB)
20:14:05.890 INFO BlockManagerInfo - Added broadcast_191_piece0 in memory on localhost:35739 (size: 56.3 KiB, free: 1919.7 MiB)
20:14:05.890 INFO SparkContext - Created broadcast 191 from broadcast at DAGScheduler.scala:1580
20:14:05.890 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 105 (MapPartitionsRDD[470] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:05.890 INFO TaskSchedulerImpl - Adding task set 105.0 with 2 tasks resource profile 0
20:14:05.891 INFO TaskSetManager - Starting task 0.0 in stage 105.0 (TID 155) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:05.891 INFO TaskSetManager - Starting task 1.0 in stage 105.0 (TID 156) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:05.891 INFO Executor - Running task 0.0 in stage 105.0 (TID 155)
20:14:05.891 INFO Executor - Running task 1.0 in stage 105.0 (TID 156)
20:14:05.896 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.896 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.896 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.896 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.896 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.896 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.898 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.898 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.898 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.898 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:05.898 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:05.898 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:05.908 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.908 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.910 INFO ShuffleBlockFetcherIterator - Getting 1 (160.4 KiB) non-empty blocks including 1 (160.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.910 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.917 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014052504446150849022098_0470_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest38783702903870548621.bam/_temporary/0/task_202502102014052504446150849022098_0470_r_000001
20:14:05.917 INFO SparkHadoopMapRedUtil - attempt_202502102014052504446150849022098_0470_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:05.917 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014052504446150849022098_0470_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest38783702903870548621.bam/_temporary/0/task_202502102014052504446150849022098_0470_r_000000
20:14:05.917 INFO SparkHadoopMapRedUtil - attempt_202502102014052504446150849022098_0470_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:05.918 INFO Executor - Finished task 1.0 in stage 105.0 (TID 156). 1729 bytes result sent to driver
20:14:05.918 INFO Executor - Finished task 0.0 in stage 105.0 (TID 155). 1729 bytes result sent to driver
20:14:05.918 INFO TaskSetManager - Finished task 1.0 in stage 105.0 (TID 156) in 27 ms on localhost (executor driver) (1/2)
20:14:05.918 INFO TaskSetManager - Finished task 0.0 in stage 105.0 (TID 155) in 27 ms on localhost (executor driver) (2/2)
20:14:05.918 INFO TaskSchedulerImpl - Removed TaskSet 105.0, whose tasks have all completed, from pool
20:14:05.919 INFO DAGScheduler - ResultStage 105 (runJob at SparkHadoopWriter.scala:83) finished in 0.036 s
20:14:05.919 INFO DAGScheduler - Job 74 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:05.919 INFO TaskSchedulerImpl - Killing all running tasks in stage 105: Stage finished
20:14:05.919 INFO DAGScheduler - Job 74 finished: runJob at SparkHadoopWriter.scala:83, took 0.102627 s
20:14:05.919 INFO SparkHadoopWriter - Start to commit write Job job_202502102014052504446150849022098_0470.
20:14:05.924 INFO SparkHadoopWriter - Write Job job_202502102014052504446150849022098_0470 committed. Elapsed time: 4 ms.
20:14:05.926 INFO MemoryStore - Block broadcast_192 stored as values in memory (estimated size 297.9 KiB, free 1918.6 MiB)
20:14:05.933 INFO MemoryStore - Block broadcast_192_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.5 MiB)
20:14:05.933 INFO BlockManagerInfo - Added broadcast_192_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:05.933 INFO SparkContext - Created broadcast 192 from newAPIHadoopFile at PathSplitSource.java:96
20:14:05.956 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:05.957 INFO DAGScheduler - Got job 75 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:05.957 INFO DAGScheduler - Final stage: ResultStage 107 (count at ReadsSparkSinkUnitTest.java:222)
20:14:05.957 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 106)
20:14:05.957 INFO DAGScheduler - Missing parents: List()
20:14:05.957 INFO DAGScheduler - Submitting ResultStage 107 (MapPartitionsRDD[461] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:05.958 INFO MemoryStore - Block broadcast_193 stored as values in memory (estimated size 6.3 KiB, free 1918.5 MiB)
20:14:05.959 INFO MemoryStore - Block broadcast_193_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.5 MiB)
20:14:05.959 INFO BlockManagerInfo - Added broadcast_193_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.7 MiB)
20:14:05.959 INFO SparkContext - Created broadcast 193 from broadcast at DAGScheduler.scala:1580
20:14:05.959 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 107 (MapPartitionsRDD[461] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:05.959 INFO TaskSchedulerImpl - Adding task set 107.0 with 2 tasks resource profile 0
20:14:05.960 INFO TaskSetManager - Starting task 0.0 in stage 107.0 (TID 157) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:05.960 INFO TaskSetManager - Starting task 1.0 in stage 107.0 (TID 158) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:05.961 INFO Executor - Running task 0.0 in stage 107.0 (TID 157)
20:14:05.961 INFO Executor - Running task 1.0 in stage 107.0 (TID 158)
20:14:05.962 INFO ShuffleBlockFetcherIterator - Getting 1 (160.4 KiB) non-empty blocks including 1 (160.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.962 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.963 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:05.963 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:05.967 INFO Executor - Finished task 0.0 in stage 107.0 (TID 157). 1634 bytes result sent to driver
20:14:05.967 INFO Executor - Finished task 1.0 in stage 107.0 (TID 158). 1634 bytes result sent to driver
20:14:05.967 INFO TaskSetManager - Finished task 1.0 in stage 107.0 (TID 158) in 7 ms on localhost (executor driver) (1/2)
20:14:05.967 INFO TaskSetManager - Finished task 0.0 in stage 107.0 (TID 157) in 7 ms on localhost (executor driver) (2/2)
20:14:05.967 INFO TaskSchedulerImpl - Removed TaskSet 107.0, whose tasks have all completed, from pool
20:14:05.967 INFO DAGScheduler - ResultStage 107 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.009 s
20:14:05.967 INFO DAGScheduler - Job 75 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:05.967 INFO TaskSchedulerImpl - Killing all running tasks in stage 107: Stage finished
20:14:05.967 INFO DAGScheduler - Job 75 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.010955 s
20:14:05.980 INFO FileInputFormat - Total input files to process : 2
20:14:05.984 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:05.984 INFO DAGScheduler - Got job 76 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:05.984 INFO DAGScheduler - Final stage: ResultStage 108 (count at ReadsSparkSinkUnitTest.java:222)
20:14:05.984 INFO DAGScheduler - Parents of final stage: List()
20:14:05.984 INFO DAGScheduler - Missing parents: List()
20:14:05.984 INFO DAGScheduler - Submitting ResultStage 108 (MapPartitionsRDD[477] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:06.001 INFO MemoryStore - Block broadcast_194 stored as values in memory (estimated size 426.1 KiB, free 1918.1 MiB)
20:14:06.002 INFO MemoryStore - Block broadcast_194_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.9 MiB)
20:14:06.002 INFO BlockManagerInfo - Added broadcast_194_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:06.003 INFO SparkContext - Created broadcast 194 from broadcast at DAGScheduler.scala:1580
20:14:06.003 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 108 (MapPartitionsRDD[477] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:06.003 INFO TaskSchedulerImpl - Adding task set 108.0 with 2 tasks resource profile 0
20:14:06.003 INFO TaskSetManager - Starting task 0.0 in stage 108.0 (TID 159) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
20:14:06.004 INFO TaskSetManager - Starting task 1.0 in stage 108.0 (TID 160) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
20:14:06.004 INFO Executor - Running task 0.0 in stage 108.0 (TID 159)
20:14:06.004 INFO Executor - Running task 1.0 in stage 108.0 (TID 160)
20:14:06.033 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest38783702903870548621.bam/part-r-00001.bam:0+123314
20:14:06.033 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest38783702903870548621.bam/part-r-00000.bam:0+122169
20:14:06.038 INFO Executor - Finished task 1.0 in stage 108.0 (TID 160). 989 bytes result sent to driver
20:14:06.038 INFO Executor - Finished task 0.0 in stage 108.0 (TID 159). 989 bytes result sent to driver
20:14:06.038 INFO TaskSetManager - Finished task 0.0 in stage 108.0 (TID 159) in 35 ms on localhost (executor driver) (1/2)
20:14:06.038 INFO TaskSetManager - Finished task 1.0 in stage 108.0 (TID 160) in 34 ms on localhost (executor driver) (2/2)
20:14:06.038 INFO TaskSchedulerImpl - Removed TaskSet 108.0, whose tasks have all completed, from pool
20:14:06.039 INFO DAGScheduler - ResultStage 108 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.055 s
20:14:06.039 INFO DAGScheduler - Job 76 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.039 INFO TaskSchedulerImpl - Killing all running tasks in stage 108: Stage finished
20:14:06.039 INFO DAGScheduler - Job 76 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.055400 s
20:14:06.042 INFO MemoryStore - Block broadcast_195 stored as values in memory (estimated size 576.0 B, free 1917.9 MiB)
20:14:06.042 INFO MemoryStore - Block broadcast_195_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.9 MiB)
20:14:06.042 INFO BlockManagerInfo - Added broadcast_195_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1919.5 MiB)
20:14:06.043 INFO SparkContext - Created broadcast 195 from broadcast at CramSource.java:114
20:14:06.044 INFO MemoryStore - Block broadcast_196 stored as values in memory (estimated size 297.9 KiB, free 1917.6 MiB)
20:14:06.050 INFO MemoryStore - Block broadcast_196_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.6 MiB)
20:14:06.050 INFO BlockManagerInfo - Added broadcast_196_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:06.050 INFO SparkContext - Created broadcast 196 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.067 INFO MemoryStore - Block broadcast_197 stored as values in memory (estimated size 576.0 B, free 1917.6 MiB)
20:14:06.067 INFO MemoryStore - Block broadcast_197_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.6 MiB)
20:14:06.067 INFO BlockManagerInfo - Added broadcast_197_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1919.5 MiB)
20:14:06.067 INFO SparkContext - Created broadcast 197 from broadcast at CramSource.java:114
20:14:06.068 INFO MemoryStore - Block broadcast_198 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
20:14:06.074 INFO MemoryStore - Block broadcast_198_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
20:14:06.074 INFO BlockManagerInfo - Added broadcast_198_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:14:06.075 INFO SparkContext - Created broadcast 198 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.088 INFO MemoryStore - Block broadcast_199 stored as values in memory (estimated size 6.0 KiB, free 1917.2 MiB)
20:14:06.088 INFO MemoryStore - Block broadcast_199_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.2 MiB)
20:14:06.089 INFO BlockManagerInfo - Added broadcast_199_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.4 MiB)
20:14:06.089 INFO SparkContext - Created broadcast 199 from broadcast at ReadsSparkSink.java:133
20:14:06.089 INFO MemoryStore - Block broadcast_200 stored as values in memory (estimated size 6.2 KiB, free 1917.2 MiB)
20:14:06.090 INFO MemoryStore - Block broadcast_200_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.2 MiB)
20:14:06.090 INFO BlockManagerInfo - Added broadcast_200_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.4 MiB)
20:14:06.090 INFO SparkContext - Created broadcast 200 from broadcast at AnySamSinkMultiple.java:80
20:14:06.092 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.092 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.092 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.104 INFO FileInputFormat - Total input files to process : 1
20:14:06.110 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:06.111 INFO DAGScheduler - Registering RDD 484 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 23
20:14:06.111 INFO DAGScheduler - Got job 77 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:06.111 INFO DAGScheduler - Final stage: ResultStage 110 (runJob at SparkHadoopWriter.scala:83)
20:14:06.111 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 109)
20:14:06.111 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 109)
20:14:06.111 INFO DAGScheduler - Submitting ShuffleMapStage 109 (MapPartitionsRDD[484] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:06.125 INFO MemoryStore - Block broadcast_201 stored as values in memory (estimated size 288.4 KiB, free 1917.0 MiB)
20:14:06.126 INFO MemoryStore - Block broadcast_201_piece0 stored as bytes in memory (estimated size 104.7 KiB, free 1916.9 MiB)
20:14:06.126 INFO BlockManagerInfo - Added broadcast_201_piece0 in memory on localhost:35739 (size: 104.7 KiB, free: 1919.3 MiB)
20:14:06.126 INFO SparkContext - Created broadcast 201 from broadcast at DAGScheduler.scala:1580
20:14:06.126 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 109 (MapPartitionsRDD[484] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:06.126 INFO TaskSchedulerImpl - Adding task set 109.0 with 1 tasks resource profile 0
20:14:06.127 INFO TaskSetManager - Starting task 0.0 in stage 109.0 (TID 161) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
20:14:06.127 INFO Executor - Running task 0.0 in stage 109.0 (TID 161)
20:14:06.147 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:06.166 INFO Executor - Finished task 0.0 in stage 109.0 (TID 161). 1235 bytes result sent to driver
20:14:06.167 INFO TaskSetManager - Finished task 0.0 in stage 109.0 (TID 161) in 41 ms on localhost (executor driver) (1/1)
20:14:06.167 INFO TaskSchedulerImpl - Removed TaskSet 109.0, whose tasks have all completed, from pool
20:14:06.167 INFO DAGScheduler - ShuffleMapStage 109 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.056 s
20:14:06.167 INFO DAGScheduler - looking for newly runnable stages
20:14:06.167 INFO DAGScheduler - running: HashSet()
20:14:06.167 INFO DAGScheduler - waiting: HashSet(ResultStage 110)
20:14:06.167 INFO DAGScheduler - failed: HashSet()
20:14:06.167 INFO DAGScheduler - Submitting ResultStage 110 (MapPartitionsRDD[495] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:06.167 INFO BlockManagerInfo - Removed broadcast_189_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.3 MiB)
20:14:06.168 INFO BlockManagerInfo - Removed broadcast_191_piece0 on localhost:35739 in memory (size: 56.3 KiB, free: 1919.4 MiB)
20:14:06.169 INFO BlockManagerInfo - Removed broadcast_188_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.4 MiB)
20:14:06.170 INFO BlockManagerInfo - Removed broadcast_197_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.4 MiB)
20:14:06.171 INFO BlockManagerInfo - Removed broadcast_192_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:06.172 INFO BlockManagerInfo - Removed broadcast_194_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.6 MiB)
20:14:06.172 INFO BlockManagerInfo - Removed broadcast_190_piece0 on localhost:35739 in memory (size: 154.6 KiB, free: 1919.7 MiB)
20:14:06.173 INFO BlockManagerInfo - Removed broadcast_193_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1919.7 MiB)
20:14:06.173 INFO BlockManagerInfo - Removed broadcast_198_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:06.174 INFO BlockManagerInfo - Removed broadcast_186_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:06.178 INFO MemoryStore - Block broadcast_202 stored as values in memory (estimated size 150.3 KiB, free 1919.1 MiB)
20:14:06.179 INFO MemoryStore - Block broadcast_202_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1919.1 MiB)
20:14:06.179 INFO BlockManagerInfo - Added broadcast_202_piece0 in memory on localhost:35739 (size: 56.3 KiB, free: 1919.8 MiB)
20:14:06.179 INFO SparkContext - Created broadcast 202 from broadcast at DAGScheduler.scala:1580
20:14:06.179 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 110 (MapPartitionsRDD[495] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:06.179 INFO TaskSchedulerImpl - Adding task set 110.0 with 2 tasks resource profile 0
20:14:06.180 INFO TaskSetManager - Starting task 0.0 in stage 110.0 (TID 162) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:06.180 INFO TaskSetManager - Starting task 1.0 in stage 110.0 (TID 163) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:06.181 INFO Executor - Running task 0.0 in stage 110.0 (TID 162)
20:14:06.181 INFO Executor - Running task 1.0 in stage 110.0 (TID 163)
20:14:06.185 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.185 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.185 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.185 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.185 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.185 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.186 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.187 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.187 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.187 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.187 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.187 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.196 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.196 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.200 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014062984199605469250951_0495_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest58776090152730308459.cram/_temporary/0/task_202502102014062984199605469250951_0495_r_000000
20:14:06.200 INFO SparkHadoopMapRedUtil - attempt_202502102014062984199605469250951_0495_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:06.200 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.200 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.201 INFO Executor - Finished task 0.0 in stage 110.0 (TID 162). 1729 bytes result sent to driver
20:14:06.201 INFO TaskSetManager - Finished task 0.0 in stage 110.0 (TID 162) in 21 ms on localhost (executor driver) (1/2)
20:14:06.204 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014062984199605469250951_0495_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest58776090152730308459.cram/_temporary/0/task_202502102014062984199605469250951_0495_r_000001
20:14:06.204 INFO SparkHadoopMapRedUtil - attempt_202502102014062984199605469250951_0495_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:06.204 INFO Executor - Finished task 1.0 in stage 110.0 (TID 163). 1729 bytes result sent to driver
20:14:06.204 INFO TaskSetManager - Finished task 1.0 in stage 110.0 (TID 163) in 24 ms on localhost (executor driver) (2/2)
20:14:06.204 INFO TaskSchedulerImpl - Removed TaskSet 110.0, whose tasks have all completed, from pool
20:14:06.205 INFO DAGScheduler - ResultStage 110 (runJob at SparkHadoopWriter.scala:83) finished in 0.036 s
20:14:06.205 INFO DAGScheduler - Job 77 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.205 INFO TaskSchedulerImpl - Killing all running tasks in stage 110: Stage finished
20:14:06.205 INFO DAGScheduler - Job 77 finished: runJob at SparkHadoopWriter.scala:83, took 0.094430 s
20:14:06.205 INFO SparkHadoopWriter - Start to commit write Job job_202502102014062984199605469250951_0495.
20:14:06.211 INFO SparkHadoopWriter - Write Job job_202502102014062984199605469250951_0495 committed. Elapsed time: 5 ms.
20:14:06.213 INFO MemoryStore - Block broadcast_203 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
20:14:06.224 INFO MemoryStore - Block broadcast_203_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
20:14:06.224 INFO BlockManagerInfo - Added broadcast_203_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:06.224 INFO SparkContext - Created broadcast 203 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.253 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:06.253 INFO DAGScheduler - Got job 78 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:06.253 INFO DAGScheduler - Final stage: ResultStage 112 (count at ReadsSparkSinkUnitTest.java:222)
20:14:06.253 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 111)
20:14:06.253 INFO DAGScheduler - Missing parents: List()
20:14:06.253 INFO DAGScheduler - Submitting ResultStage 112 (MapPartitionsRDD[487] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:06.254 INFO MemoryStore - Block broadcast_204 stored as values in memory (estimated size 6.3 KiB, free 1918.7 MiB)
20:14:06.255 INFO MemoryStore - Block broadcast_204_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.7 MiB)
20:14:06.255 INFO BlockManagerInfo - Added broadcast_204_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.7 MiB)
20:14:06.255 INFO SparkContext - Created broadcast 204 from broadcast at DAGScheduler.scala:1580
20:14:06.255 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 112 (MapPartitionsRDD[487] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:06.255 INFO TaskSchedulerImpl - Adding task set 112.0 with 2 tasks resource profile 0
20:14:06.256 INFO TaskSetManager - Starting task 0.0 in stage 112.0 (TID 164) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:06.256 INFO TaskSetManager - Starting task 1.0 in stage 112.0 (TID 165) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:06.256 INFO Executor - Running task 0.0 in stage 112.0 (TID 164)
20:14:06.256 INFO Executor - Running task 1.0 in stage 112.0 (TID 165)
20:14:06.258 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.258 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.258 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.258 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.261 INFO Executor - Finished task 0.0 in stage 112.0 (TID 164). 1591 bytes result sent to driver
20:14:06.261 INFO Executor - Finished task 1.0 in stage 112.0 (TID 165). 1591 bytes result sent to driver
20:14:06.261 INFO TaskSetManager - Finished task 0.0 in stage 112.0 (TID 164) in 5 ms on localhost (executor driver) (1/2)
20:14:06.261 INFO TaskSetManager - Finished task 1.0 in stage 112.0 (TID 165) in 5 ms on localhost (executor driver) (2/2)
20:14:06.261 INFO TaskSchedulerImpl - Removed TaskSet 112.0, whose tasks have all completed, from pool
20:14:06.262 INFO DAGScheduler - ResultStage 112 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.007 s
20:14:06.262 INFO DAGScheduler - Job 78 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.262 INFO TaskSchedulerImpl - Killing all running tasks in stage 112: Stage finished
20:14:06.262 INFO DAGScheduler - Job 78 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.009109 s
20:14:06.274 INFO FileInputFormat - Total input files to process : 2
20:14:06.278 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:06.278 INFO DAGScheduler - Got job 79 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:06.278 INFO DAGScheduler - Final stage: ResultStage 113 (count at ReadsSparkSinkUnitTest.java:222)
20:14:06.278 INFO DAGScheduler - Parents of final stage: List()
20:14:06.278 INFO DAGScheduler - Missing parents: List()
20:14:06.278 INFO DAGScheduler - Submitting ResultStage 113 (MapPartitionsRDD[502] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:06.297 INFO MemoryStore - Block broadcast_205 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
20:14:06.299 INFO MemoryStore - Block broadcast_205_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.1 MiB)
20:14:06.299 INFO BlockManagerInfo - Added broadcast_205_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.6 MiB)
20:14:06.299 INFO SparkContext - Created broadcast 205 from broadcast at DAGScheduler.scala:1580
20:14:06.299 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 113 (MapPartitionsRDD[502] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:06.299 INFO TaskSchedulerImpl - Adding task set 113.0 with 2 tasks resource profile 0
20:14:06.300 INFO TaskSetManager - Starting task 0.0 in stage 113.0 (TID 166) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
20:14:06.300 INFO TaskSetManager - Starting task 1.0 in stage 113.0 (TID 167) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
20:14:06.300 INFO Executor - Running task 0.0 in stage 113.0 (TID 166)
20:14:06.300 INFO Executor - Running task 1.0 in stage 113.0 (TID 167)
20:14:06.330 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest58776090152730308459.cram/part-r-00000.bam:0+31473
20:14:06.330 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest58776090152730308459.cram/part-r-00001.bam:0+30825
20:14:06.332 INFO Executor - Finished task 1.0 in stage 113.0 (TID 167). 989 bytes result sent to driver
20:14:06.332 INFO Executor - Finished task 0.0 in stage 113.0 (TID 166). 989 bytes result sent to driver
20:14:06.333 INFO TaskSetManager - Finished task 0.0 in stage 113.0 (TID 166) in 33 ms on localhost (executor driver) (1/2)
20:14:06.333 INFO TaskSetManager - Finished task 1.0 in stage 113.0 (TID 167) in 33 ms on localhost (executor driver) (2/2)
20:14:06.333 INFO TaskSchedulerImpl - Removed TaskSet 113.0, whose tasks have all completed, from pool
20:14:06.333 INFO DAGScheduler - ResultStage 113 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.055 s
20:14:06.333 INFO DAGScheduler - Job 79 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.333 INFO TaskSchedulerImpl - Killing all running tasks in stage 113: Stage finished
20:14:06.334 INFO DAGScheduler - Job 79 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.056018 s
20:14:06.336 INFO MemoryStore - Block broadcast_206 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
20:14:06.342 INFO MemoryStore - Block broadcast_206_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.8 MiB)
20:14:06.343 INFO BlockManagerInfo - Added broadcast_206_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:06.343 INFO SparkContext - Created broadcast 206 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.366 INFO MemoryStore - Block broadcast_207 stored as values in memory (estimated size 297.9 KiB, free 1917.5 MiB)
20:14:06.372 INFO MemoryStore - Block broadcast_207_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
20:14:06.372 INFO BlockManagerInfo - Added broadcast_207_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:06.372 INFO SparkContext - Created broadcast 207 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.392 INFO MemoryStore - Block broadcast_208 stored as values in memory (estimated size 160.7 KiB, free 1917.3 MiB)
20:14:06.393 INFO MemoryStore - Block broadcast_208_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
20:14:06.393 INFO BlockManagerInfo - Added broadcast_208_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:14:06.393 INFO SparkContext - Created broadcast 208 from broadcast at ReadsSparkSink.java:133
20:14:06.394 INFO MemoryStore - Block broadcast_209 stored as values in memory (estimated size 163.2 KiB, free 1917.1 MiB)
20:14:06.395 INFO MemoryStore - Block broadcast_209_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.1 MiB)
20:14:06.395 INFO BlockManagerInfo - Added broadcast_209_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:14:06.395 INFO SparkContext - Created broadcast 209 from broadcast at AnySamSinkMultiple.java:80
20:14:06.397 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.397 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.397 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.408 INFO FileInputFormat - Total input files to process : 1
20:14:06.415 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:06.416 INFO DAGScheduler - Registering RDD 510 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 24
20:14:06.416 INFO DAGScheduler - Got job 80 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
20:14:06.416 INFO DAGScheduler - Final stage: ResultStage 115 (runJob at SparkHadoopWriter.scala:83)
20:14:06.416 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 114)
20:14:06.416 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 114)
20:14:06.416 INFO DAGScheduler - Submitting ShuffleMapStage 114 (MapPartitionsRDD[510] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:06.436 INFO MemoryStore - Block broadcast_210 stored as values in memory (estimated size 427.7 KiB, free 1916.7 MiB)
20:14:06.437 INFO MemoryStore - Block broadcast_210_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.6 MiB)
20:14:06.437 INFO BlockManagerInfo - Added broadcast_210_piece0 in memory on localhost:35739 (size: 154.6 KiB, free: 1919.3 MiB)
20:14:06.438 INFO SparkContext - Created broadcast 210 from broadcast at DAGScheduler.scala:1580
20:14:06.438 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 114 (MapPartitionsRDD[510] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
20:14:06.438 INFO TaskSchedulerImpl - Adding task set 114.0 with 1 tasks resource profile 0
20:14:06.439 INFO TaskSetManager - Starting task 0.0 in stage 114.0 (TID 168) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:06.439 INFO Executor - Running task 0.0 in stage 114.0 (TID 168)
20:14:06.473 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:06.490 INFO Executor - Finished task 0.0 in stage 114.0 (TID 168). 1149 bytes result sent to driver
20:14:06.491 INFO TaskSetManager - Finished task 0.0 in stage 114.0 (TID 168) in 53 ms on localhost (executor driver) (1/1)
20:14:06.491 INFO TaskSchedulerImpl - Removed TaskSet 114.0, whose tasks have all completed, from pool
20:14:06.491 INFO DAGScheduler - ShuffleMapStage 114 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.074 s
20:14:06.491 INFO DAGScheduler - looking for newly runnable stages
20:14:06.491 INFO DAGScheduler - running: HashSet()
20:14:06.491 INFO DAGScheduler - waiting: HashSet(ResultStage 115)
20:14:06.491 INFO DAGScheduler - failed: HashSet()
20:14:06.491 INFO DAGScheduler - Submitting ResultStage 115 (MapPartitionsRDD[522] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
20:14:06.498 INFO MemoryStore - Block broadcast_211 stored as values in memory (estimated size 150.2 KiB, free 1916.4 MiB)
20:14:06.498 INFO MemoryStore - Block broadcast_211_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1916.4 MiB)
20:14:06.498 INFO BlockManagerInfo - Added broadcast_211_piece0 in memory on localhost:35739 (size: 56.2 KiB, free: 1919.3 MiB)
20:14:06.499 INFO SparkContext - Created broadcast 211 from broadcast at DAGScheduler.scala:1580
20:14:06.499 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 115 (MapPartitionsRDD[522] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
20:14:06.499 INFO TaskSchedulerImpl - Adding task set 115.0 with 2 tasks resource profile 0
20:14:06.499 INFO TaskSetManager - Starting task 0.0 in stage 115.0 (TID 169) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:06.500 INFO TaskSetManager - Starting task 1.0 in stage 115.0 (TID 170) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:06.500 INFO Executor - Running task 1.0 in stage 115.0 (TID 170)
20:14:06.500 INFO Executor - Running task 0.0 in stage 115.0 (TID 169)
20:14:06.506 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.506 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.506 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.506 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.506 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.506 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.506 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.506 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.506 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.506 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.506 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.506 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.515 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.515 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.519 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.519 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.521 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014066082552913860249354_0522_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest613826244703645989554.sam/_temporary/0/task_202502102014066082552913860249354_0522_r_000001
20:14:06.521 INFO SparkHadoopMapRedUtil - attempt_202502102014066082552913860249354_0522_r_000001_0: Committed. Elapsed time: 0 ms.
20:14:06.522 INFO Executor - Finished task 1.0 in stage 115.0 (TID 170). 1729 bytes result sent to driver
20:14:06.522 INFO TaskSetManager - Finished task 1.0 in stage 115.0 (TID 170) in 23 ms on localhost (executor driver) (1/2)
20:14:06.526 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014066082552913860249354_0522_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest613826244703645989554.sam/_temporary/0/task_202502102014066082552913860249354_0522_r_000000
20:14:06.526 INFO SparkHadoopMapRedUtil - attempt_202502102014066082552913860249354_0522_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:06.526 INFO Executor - Finished task 0.0 in stage 115.0 (TID 169). 1729 bytes result sent to driver
20:14:06.526 INFO TaskSetManager - Finished task 0.0 in stage 115.0 (TID 169) in 27 ms on localhost (executor driver) (2/2)
20:14:06.527 INFO TaskSchedulerImpl - Removed TaskSet 115.0, whose tasks have all completed, from pool
20:14:06.527 INFO DAGScheduler - ResultStage 115 (runJob at SparkHadoopWriter.scala:83) finished in 0.035 s
20:14:06.527 INFO DAGScheduler - Job 80 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.527 INFO TaskSchedulerImpl - Killing all running tasks in stage 115: Stage finished
20:14:06.527 INFO DAGScheduler - Job 80 finished: runJob at SparkHadoopWriter.scala:83, took 0.112254 s
20:14:06.527 INFO SparkHadoopWriter - Start to commit write Job job_202502102014066082552913860249354_0522.
20:14:06.533 INFO SparkHadoopWriter - Write Job job_202502102014066082552913860249354_0522 committed. Elapsed time: 5 ms.
20:14:06.536 INFO MemoryStore - Block broadcast_212 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
20:14:06.546 INFO MemoryStore - Block broadcast_212_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
20:14:06.546 INFO BlockManagerInfo - Added broadcast_212_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:06.547 INFO SparkContext - Created broadcast 212 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.571 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:06.572 INFO DAGScheduler - Got job 81 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:06.572 INFO DAGScheduler - Final stage: ResultStage 117 (count at ReadsSparkSinkUnitTest.java:222)
20:14:06.572 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 116)
20:14:06.572 INFO DAGScheduler - Missing parents: List()
20:14:06.572 INFO DAGScheduler - Submitting ResultStage 117 (MapPartitionsRDD[513] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
20:14:06.573 INFO MemoryStore - Block broadcast_213 stored as values in memory (estimated size 6.3 KiB, free 1916.0 MiB)
20:14:06.573 INFO MemoryStore - Block broadcast_213_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.0 MiB)
20:14:06.573 INFO BlockManagerInfo - Added broadcast_213_piece0 in memory on localhost:35739 (size: 3.4 KiB, free: 1919.2 MiB)
20:14:06.574 INFO SparkContext - Created broadcast 213 from broadcast at DAGScheduler.scala:1580
20:14:06.574 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 117 (MapPartitionsRDD[513] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
20:14:06.574 INFO TaskSchedulerImpl - Adding task set 117.0 with 2 tasks resource profile 0
20:14:06.574 INFO TaskSetManager - Starting task 0.0 in stage 117.0 (TID 171) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
20:14:06.575 INFO TaskSetManager - Starting task 1.0 in stage 117.0 (TID 172) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
20:14:06.575 INFO Executor - Running task 1.0 in stage 117.0 (TID 172)
20:14:06.575 INFO Executor - Running task 0.0 in stage 117.0 (TID 171)
20:14:06.576 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.576 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.576 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.577 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.586 INFO Executor - Finished task 1.0 in stage 117.0 (TID 172). 1677 bytes result sent to driver
20:14:06.586 INFO TaskSetManager - Finished task 1.0 in stage 117.0 (TID 172) in 11 ms on localhost (executor driver) (1/2)
20:14:06.588 INFO Executor - Finished task 0.0 in stage 117.0 (TID 171). 1634 bytes result sent to driver
20:14:06.588 INFO TaskSetManager - Finished task 0.0 in stage 117.0 (TID 171) in 14 ms on localhost (executor driver) (2/2)
20:14:06.588 INFO TaskSchedulerImpl - Removed TaskSet 117.0, whose tasks have all completed, from pool
20:14:06.588 INFO BlockManagerInfo - Removed broadcast_202_piece0 on localhost:35739 in memory (size: 56.3 KiB, free: 1919.3 MiB)
20:14:06.588 INFO DAGScheduler - ResultStage 117 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.016 s
20:14:06.588 INFO DAGScheduler - Job 81 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.588 INFO TaskSchedulerImpl - Killing all running tasks in stage 117: Stage finished
20:14:06.589 INFO DAGScheduler - Job 81 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.017452 s
20:14:06.591 INFO BlockManagerInfo - Removed broadcast_209_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.3 MiB)
20:14:06.591 INFO BlockManagerInfo - Removed broadcast_199_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.3 MiB)
20:14:06.592 INFO BlockManagerInfo - Removed broadcast_207_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:06.593 INFO BlockManagerInfo - Removed broadcast_201_piece0 on localhost:35739 in memory (size: 104.7 KiB, free: 1919.4 MiB)
20:14:06.595 INFO BlockManagerInfo - Removed broadcast_195_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.4 MiB)
20:14:06.596 INFO BlockManagerInfo - Removed broadcast_196_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:06.597 INFO BlockManagerInfo - Removed broadcast_211_piece0 on localhost:35739 in memory (size: 56.2 KiB, free: 1919.5 MiB)
20:14:06.597 INFO BlockManagerInfo - Removed broadcast_204_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1919.5 MiB)
20:14:06.599 INFO BlockManagerInfo - Removed broadcast_210_piece0 on localhost:35739 in memory (size: 154.6 KiB, free: 1919.7 MiB)
20:14:06.599 INFO BlockManagerInfo - Removed broadcast_208_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:14:06.600 INFO BlockManagerInfo - Removed broadcast_205_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.8 MiB)
20:14:06.600 INFO BlockManagerInfo - Removed broadcast_200_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.8 MiB)
20:14:06.601 INFO BlockManagerInfo - Removed broadcast_203_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:06.607 INFO FileInputFormat - Total input files to process : 2
20:14:06.612 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
20:14:06.612 INFO DAGScheduler - Got job 82 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
20:14:06.612 INFO DAGScheduler - Final stage: ResultStage 118 (count at ReadsSparkSinkUnitTest.java:222)
20:14:06.612 INFO DAGScheduler - Parents of final stage: List()
20:14:06.613 INFO DAGScheduler - Missing parents: List()
20:14:06.613 INFO DAGScheduler - Submitting ResultStage 118 (MapPartitionsRDD[529] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:06.630 INFO MemoryStore - Block broadcast_214 stored as values in memory (estimated size 426.1 KiB, free 1918.9 MiB)
20:14:06.632 INFO MemoryStore - Block broadcast_214_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.7 MiB)
20:14:06.632 INFO BlockManagerInfo - Added broadcast_214_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.7 MiB)
20:14:06.632 INFO SparkContext - Created broadcast 214 from broadcast at DAGScheduler.scala:1580
20:14:06.632 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 118 (MapPartitionsRDD[529] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
20:14:06.632 INFO TaskSchedulerImpl - Adding task set 118.0 with 2 tasks resource profile 0
20:14:06.633 INFO TaskSetManager - Starting task 0.0 in stage 118.0 (TID 173) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
20:14:06.633 INFO TaskSetManager - Starting task 1.0 in stage 118.0 (TID 174) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
20:14:06.633 INFO Executor - Running task 0.0 in stage 118.0 (TID 173)
20:14:06.633 INFO Executor - Running task 1.0 in stage 118.0 (TID 174)
20:14:06.663 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest613826244703645989554.sam/part-r-00001.bam:0+129330
20:14:06.663 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest613826244703645989554.sam/part-r-00000.bam:0+132492
20:14:06.672 INFO Executor - Finished task 0.0 in stage 118.0 (TID 173). 989 bytes result sent to driver
20:14:06.672 INFO TaskSetManager - Finished task 0.0 in stage 118.0 (TID 173) in 39 ms on localhost (executor driver) (1/2)
20:14:06.673 INFO Executor - Finished task 1.0 in stage 118.0 (TID 174). 989 bytes result sent to driver
20:14:06.673 INFO TaskSetManager - Finished task 1.0 in stage 118.0 (TID 174) in 40 ms on localhost (executor driver) (2/2)
20:14:06.673 INFO TaskSchedulerImpl - Removed TaskSet 118.0, whose tasks have all completed, from pool
20:14:06.673 INFO DAGScheduler - ResultStage 118 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.060 s
20:14:06.673 INFO DAGScheduler - Job 82 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.673 INFO TaskSchedulerImpl - Killing all running tasks in stage 118: Stage finished
20:14:06.673 INFO DAGScheduler - Job 82 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.061151 s
20:14:06.677 INFO MemoryStore - Block broadcast_215 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
20:14:06.683 INFO MemoryStore - Block broadcast_215_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
20:14:06.683 INFO BlockManagerInfo - Added broadcast_215_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:06.683 INFO SparkContext - Created broadcast 215 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.705 INFO MemoryStore - Block broadcast_216 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:06.711 INFO MemoryStore - Block broadcast_216_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
20:14:06.711 INFO BlockManagerInfo - Added broadcast_216_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:06.712 INFO SparkContext - Created broadcast 216 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.732 INFO FileInputFormat - Total input files to process : 1
20:14:06.734 INFO MemoryStore - Block broadcast_217 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
20:14:06.735 INFO MemoryStore - Block broadcast_217_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
20:14:06.735 INFO BlockManagerInfo - Added broadcast_217_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:06.735 INFO SparkContext - Created broadcast 217 from broadcast at ReadsSparkSink.java:133
20:14:06.737 INFO MemoryStore - Block broadcast_218 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
20:14:06.737 INFO MemoryStore - Block broadcast_218_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
20:14:06.737 INFO BlockManagerInfo - Added broadcast_218_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:06.738 INFO SparkContext - Created broadcast 218 from broadcast at BamSink.java:76
20:14:06.739 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.739 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.739 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.758 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:06.758 INFO DAGScheduler - Registering RDD 543 (mapToPair at SparkUtils.java:161) as input to shuffle 25
20:14:06.759 INFO DAGScheduler - Got job 83 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:06.759 INFO DAGScheduler - Final stage: ResultStage 120 (runJob at SparkHadoopWriter.scala:83)
20:14:06.759 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 119)
20:14:06.759 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 119)
20:14:06.759 INFO DAGScheduler - Submitting ShuffleMapStage 119 (MapPartitionsRDD[543] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:06.776 INFO MemoryStore - Block broadcast_219 stored as values in memory (estimated size 520.4 KiB, free 1917.2 MiB)
20:14:06.777 INFO MemoryStore - Block broadcast_219_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.1 MiB)
20:14:06.777 INFO BlockManagerInfo - Added broadcast_219_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.5 MiB)
20:14:06.778 INFO SparkContext - Created broadcast 219 from broadcast at DAGScheduler.scala:1580
20:14:06.778 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 119 (MapPartitionsRDD[543] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:06.778 INFO TaskSchedulerImpl - Adding task set 119.0 with 1 tasks resource profile 0
20:14:06.778 INFO TaskSetManager - Starting task 0.0 in stage 119.0 (TID 175) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:06.779 INFO Executor - Running task 0.0 in stage 119.0 (TID 175)
20:14:06.809 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:06.825 INFO Executor - Finished task 0.0 in stage 119.0 (TID 175). 1148 bytes result sent to driver
20:14:06.825 INFO TaskSetManager - Finished task 0.0 in stage 119.0 (TID 175) in 47 ms on localhost (executor driver) (1/1)
20:14:06.825 INFO TaskSchedulerImpl - Removed TaskSet 119.0, whose tasks have all completed, from pool
20:14:06.825 INFO DAGScheduler - ShuffleMapStage 119 (mapToPair at SparkUtils.java:161) finished in 0.066 s
20:14:06.826 INFO DAGScheduler - looking for newly runnable stages
20:14:06.826 INFO DAGScheduler - running: HashSet()
20:14:06.826 INFO DAGScheduler - waiting: HashSet(ResultStage 120)
20:14:06.826 INFO DAGScheduler - failed: HashSet()
20:14:06.826 INFO DAGScheduler - Submitting ResultStage 120 (MapPartitionsRDD[548] at mapToPair at BamSink.java:91), which has no missing parents
20:14:06.832 INFO MemoryStore - Block broadcast_220 stored as values in memory (estimated size 241.4 KiB, free 1916.8 MiB)
20:14:06.833 INFO MemoryStore - Block broadcast_220_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.8 MiB)
20:14:06.833 INFO BlockManagerInfo - Added broadcast_220_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.4 MiB)
20:14:06.833 INFO SparkContext - Created broadcast 220 from broadcast at DAGScheduler.scala:1580
20:14:06.834 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 120 (MapPartitionsRDD[548] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:06.834 INFO TaskSchedulerImpl - Adding task set 120.0 with 1 tasks resource profile 0
20:14:06.834 INFO TaskSetManager - Starting task 0.0 in stage 120.0 (TID 176) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:06.835 INFO Executor - Running task 0.0 in stage 120.0 (TID 176)
20:14:06.839 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:06.839 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:06.851 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.851 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.851 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.851 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:06.851 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:06.851 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:06.881 INFO FileOutputCommitter - Saved output of task 'attempt_20250210201406737923743480918345_0548_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest18755370654995842527.bam.parts/_temporary/0/task_20250210201406737923743480918345_0548_r_000000
20:14:06.881 INFO SparkHadoopMapRedUtil - attempt_20250210201406737923743480918345_0548_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:06.882 INFO Executor - Finished task 0.0 in stage 120.0 (TID 176). 1858 bytes result sent to driver
20:14:06.882 INFO TaskSetManager - Finished task 0.0 in stage 120.0 (TID 176) in 48 ms on localhost (executor driver) (1/1)
20:14:06.882 INFO TaskSchedulerImpl - Removed TaskSet 120.0, whose tasks have all completed, from pool
20:14:06.882 INFO DAGScheduler - ResultStage 120 (runJob at SparkHadoopWriter.scala:83) finished in 0.056 s
20:14:06.882 INFO DAGScheduler - Job 83 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.882 INFO TaskSchedulerImpl - Killing all running tasks in stage 120: Stage finished
20:14:06.883 INFO DAGScheduler - Job 83 finished: runJob at SparkHadoopWriter.scala:83, took 0.124786 s
20:14:06.883 INFO SparkHadoopWriter - Start to commit write Job job_20250210201406737923743480918345_0548.
20:14:06.889 INFO SparkHadoopWriter - Write Job job_20250210201406737923743480918345_0548 committed. Elapsed time: 6 ms.
20:14:06.901 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest18755370654995842527.bam
20:14:06.906 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest18755370654995842527.bam done
20:14:06.906 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest18755370654995842527.bam.parts/ to /tmp/ReadsSparkSinkUnitTest18755370654995842527.bam.sbi
20:14:06.911 INFO IndexFileMerger - Done merging .sbi files
20:14:06.911 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest18755370654995842527.bam.parts/ to /tmp/ReadsSparkSinkUnitTest18755370654995842527.bam.bai
20:14:06.917 INFO IndexFileMerger - Done merging .bai files
20:14:06.919 INFO MemoryStore - Block broadcast_221 stored as values in memory (estimated size 320.0 B, free 1916.8 MiB)
20:14:06.919 INFO MemoryStore - Block broadcast_221_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.8 MiB)
20:14:06.920 INFO BlockManagerInfo - Added broadcast_221_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.4 MiB)
20:14:06.920 INFO SparkContext - Created broadcast 221 from broadcast at BamSource.java:104
20:14:06.921 INFO MemoryStore - Block broadcast_222 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
20:14:06.927 INFO MemoryStore - Block broadcast_222_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:14:06.927 INFO BlockManagerInfo - Added broadcast_222_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:14:06.927 INFO SparkContext - Created broadcast 222 from newAPIHadoopFile at PathSplitSource.java:96
20:14:06.936 INFO FileInputFormat - Total input files to process : 1
20:14:06.950 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:06.951 INFO DAGScheduler - Got job 84 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:06.951 INFO DAGScheduler - Final stage: ResultStage 121 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:06.951 INFO DAGScheduler - Parents of final stage: List()
20:14:06.951 INFO DAGScheduler - Missing parents: List()
20:14:06.951 INFO DAGScheduler - Submitting ResultStage 121 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:06.961 INFO MemoryStore - Block broadcast_223 stored as values in memory (estimated size 148.2 KiB, free 1916.3 MiB)
20:14:06.962 INFO MemoryStore - Block broadcast_223_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.2 MiB)
20:14:06.962 INFO BlockManagerInfo - Added broadcast_223_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.3 MiB)
20:14:06.962 INFO SparkContext - Created broadcast 223 from broadcast at DAGScheduler.scala:1580
20:14:06.962 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 121 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:06.962 INFO TaskSchedulerImpl - Adding task set 121.0 with 1 tasks resource profile 0
20:14:06.963 INFO TaskSetManager - Starting task 0.0 in stage 121.0 (TID 177) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:06.963 INFO Executor - Running task 0.0 in stage 121.0 (TID 177)
20:14:06.979 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest18755370654995842527.bam:0+237038
20:14:06.983 INFO Executor - Finished task 0.0 in stage 121.0 (TID 177). 651483 bytes result sent to driver
20:14:06.985 INFO TaskSetManager - Finished task 0.0 in stage 121.0 (TID 177) in 22 ms on localhost (executor driver) (1/1)
20:14:06.985 INFO TaskSchedulerImpl - Removed TaskSet 121.0, whose tasks have all completed, from pool
20:14:06.985 INFO DAGScheduler - ResultStage 121 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.034 s
20:14:06.985 INFO DAGScheduler - Job 84 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:06.985 INFO TaskSchedulerImpl - Killing all running tasks in stage 121: Stage finished
20:14:06.985 INFO DAGScheduler - Job 84 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.034768 s
20:14:07.000 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:07.000 INFO DAGScheduler - Got job 85 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:07.000 INFO DAGScheduler - Final stage: ResultStage 122 (count at ReadsSparkSinkUnitTest.java:185)
20:14:07.000 INFO DAGScheduler - Parents of final stage: List()
20:14:07.001 INFO DAGScheduler - Missing parents: List()
20:14:07.001 INFO DAGScheduler - Submitting ResultStage 122 (MapPartitionsRDD[536] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:07.017 INFO MemoryStore - Block broadcast_224 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
20:14:07.019 INFO MemoryStore - Block broadcast_224_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
20:14:07.019 INFO BlockManagerInfo - Added broadcast_224_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.2 MiB)
20:14:07.019 INFO SparkContext - Created broadcast 224 from broadcast at DAGScheduler.scala:1580
20:14:07.019 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 122 (MapPartitionsRDD[536] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:07.019 INFO TaskSchedulerImpl - Adding task set 122.0 with 1 tasks resource profile 0
20:14:07.020 INFO TaskSetManager - Starting task 0.0 in stage 122.0 (TID 178) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:07.020 INFO Executor - Running task 0.0 in stage 122.0 (TID 178)
20:14:07.054 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:07.064 INFO Executor - Finished task 0.0 in stage 122.0 (TID 178). 989 bytes result sent to driver
20:14:07.064 INFO TaskSetManager - Finished task 0.0 in stage 122.0 (TID 178) in 44 ms on localhost (executor driver) (1/1)
20:14:07.064 INFO TaskSchedulerImpl - Removed TaskSet 122.0, whose tasks have all completed, from pool
20:14:07.064 INFO DAGScheduler - ResultStage 122 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
20:14:07.065 INFO DAGScheduler - Job 85 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.065 INFO TaskSchedulerImpl - Killing all running tasks in stage 122: Stage finished
20:14:07.065 INFO DAGScheduler - Job 85 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.064476 s
20:14:07.068 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:07.068 INFO DAGScheduler - Got job 86 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:07.068 INFO DAGScheduler - Final stage: ResultStage 123 (count at ReadsSparkSinkUnitTest.java:185)
20:14:07.068 INFO DAGScheduler - Parents of final stage: List()
20:14:07.068 INFO DAGScheduler - Missing parents: List()
20:14:07.069 INFO DAGScheduler - Submitting ResultStage 123 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:07.075 INFO MemoryStore - Block broadcast_225 stored as values in memory (estimated size 148.1 KiB, free 1915.5 MiB)
20:14:07.075 INFO MemoryStore - Block broadcast_225_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.5 MiB)
20:14:07.076 INFO BlockManagerInfo - Added broadcast_225_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.1 MiB)
20:14:07.076 INFO SparkContext - Created broadcast 225 from broadcast at DAGScheduler.scala:1580
20:14:07.076 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 123 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:07.076 INFO TaskSchedulerImpl - Adding task set 123.0 with 1 tasks resource profile 0
20:14:07.077 INFO TaskSetManager - Starting task 0.0 in stage 123.0 (TID 179) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:07.077 INFO Executor - Running task 0.0 in stage 123.0 (TID 179)
20:14:07.088 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest18755370654995842527.bam:0+237038
20:14:07.092 INFO Executor - Finished task 0.0 in stage 123.0 (TID 179). 989 bytes result sent to driver
20:14:07.092 INFO TaskSetManager - Finished task 0.0 in stage 123.0 (TID 179) in 16 ms on localhost (executor driver) (1/1)
20:14:07.092 INFO TaskSchedulerImpl - Removed TaskSet 123.0, whose tasks have all completed, from pool
20:14:07.092 INFO DAGScheduler - ResultStage 123 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.023 s
20:14:07.092 INFO DAGScheduler - Job 86 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.092 INFO TaskSchedulerImpl - Killing all running tasks in stage 123: Stage finished
20:14:07.092 INFO DAGScheduler - Job 86 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.024326 s
20:14:07.095 INFO MemoryStore - Block broadcast_226 stored as values in memory (estimated size 297.9 KiB, free 1915.2 MiB)
20:14:07.105 INFO BlockManagerInfo - Removed broadcast_212_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:14:07.106 INFO BlockManagerInfo - Removed broadcast_218_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.2 MiB)
20:14:07.107 INFO BlockManagerInfo - Removed broadcast_220_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.2 MiB)
20:14:07.108 INFO BlockManagerInfo - Removed broadcast_219_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.4 MiB)
20:14:07.108 INFO BlockManagerInfo - Removed broadcast_213_piece0 on localhost:35739 in memory (size: 3.4 KiB, free: 1919.4 MiB)
20:14:07.109 INFO BlockManagerInfo - Removed broadcast_225_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.4 MiB)
20:14:07.110 INFO BlockManagerInfo - Removed broadcast_215_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:07.110 INFO BlockManagerInfo - Removed broadcast_217_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:07.111 INFO BlockManagerInfo - Removed broadcast_216_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:07.111 INFO BlockManagerInfo - Removed broadcast_206_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:07.112 INFO MemoryStore - Block broadcast_226_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:07.112 INFO BlockManagerInfo - Added broadcast_226_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:07.112 INFO BlockManagerInfo - Removed broadcast_224_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:07.113 INFO SparkContext - Created broadcast 226 from newAPIHadoopFile at PathSplitSource.java:96
20:14:07.114 INFO BlockManagerInfo - Removed broadcast_222_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:07.117 INFO BlockManagerInfo - Removed broadcast_223_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.8 MiB)
20:14:07.118 INFO BlockManagerInfo - Removed broadcast_221_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.8 MiB)
20:14:07.118 INFO BlockManagerInfo - Removed broadcast_214_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1920.0 MiB)
20:14:07.147 INFO MemoryStore - Block broadcast_227 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
20:14:07.153 INFO MemoryStore - Block broadcast_227_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:14:07.153 INFO BlockManagerInfo - Added broadcast_227_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:07.153 INFO SparkContext - Created broadcast 227 from newAPIHadoopFile at PathSplitSource.java:96
20:14:07.174 INFO FileInputFormat - Total input files to process : 1
20:14:07.176 INFO MemoryStore - Block broadcast_228 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
20:14:07.176 INFO MemoryStore - Block broadcast_228_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
20:14:07.176 INFO BlockManagerInfo - Added broadcast_228_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:14:07.177 INFO SparkContext - Created broadcast 228 from broadcast at ReadsSparkSink.java:133
20:14:07.178 INFO MemoryStore - Block broadcast_229 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
20:14:07.178 INFO MemoryStore - Block broadcast_229_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
20:14:07.179 INFO BlockManagerInfo - Added broadcast_229_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:14:07.179 INFO SparkContext - Created broadcast 229 from broadcast at BamSink.java:76
20:14:07.180 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:07.180 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:07.180 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:07.198 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:07.198 INFO DAGScheduler - Registering RDD 568 (mapToPair at SparkUtils.java:161) as input to shuffle 26
20:14:07.198 INFO DAGScheduler - Got job 87 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:07.198 INFO DAGScheduler - Final stage: ResultStage 125 (runJob at SparkHadoopWriter.scala:83)
20:14:07.198 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 124)
20:14:07.198 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 124)
20:14:07.199 INFO DAGScheduler - Submitting ShuffleMapStage 124 (MapPartitionsRDD[568] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:07.215 INFO MemoryStore - Block broadcast_230 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
20:14:07.217 INFO MemoryStore - Block broadcast_230_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
20:14:07.217 INFO BlockManagerInfo - Added broadcast_230_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.7 MiB)
20:14:07.217 INFO SparkContext - Created broadcast 230 from broadcast at DAGScheduler.scala:1580
20:14:07.217 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 124 (MapPartitionsRDD[568] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:07.217 INFO TaskSchedulerImpl - Adding task set 124.0 with 1 tasks resource profile 0
20:14:07.218 INFO TaskSetManager - Starting task 0.0 in stage 124.0 (TID 180) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:07.218 INFO Executor - Running task 0.0 in stage 124.0 (TID 180)
20:14:07.248 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:07.264 INFO Executor - Finished task 0.0 in stage 124.0 (TID 180). 1148 bytes result sent to driver
20:14:07.265 INFO TaskSetManager - Finished task 0.0 in stage 124.0 (TID 180) in 47 ms on localhost (executor driver) (1/1)
20:14:07.265 INFO TaskSchedulerImpl - Removed TaskSet 124.0, whose tasks have all completed, from pool
20:14:07.265 INFO DAGScheduler - ShuffleMapStage 124 (mapToPair at SparkUtils.java:161) finished in 0.066 s
20:14:07.265 INFO DAGScheduler - looking for newly runnable stages
20:14:07.265 INFO DAGScheduler - running: HashSet()
20:14:07.265 INFO DAGScheduler - waiting: HashSet(ResultStage 125)
20:14:07.265 INFO DAGScheduler - failed: HashSet()
20:14:07.265 INFO DAGScheduler - Submitting ResultStage 125 (MapPartitionsRDD[573] at mapToPair at BamSink.java:91), which has no missing parents
20:14:07.272 INFO MemoryStore - Block broadcast_231 stored as values in memory (estimated size 241.4 KiB, free 1918.1 MiB)
20:14:07.273 INFO MemoryStore - Block broadcast_231_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.0 MiB)
20:14:07.273 INFO BlockManagerInfo - Added broadcast_231_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.7 MiB)
20:14:07.273 INFO SparkContext - Created broadcast 231 from broadcast at DAGScheduler.scala:1580
20:14:07.273 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 125 (MapPartitionsRDD[573] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:07.273 INFO TaskSchedulerImpl - Adding task set 125.0 with 1 tasks resource profile 0
20:14:07.274 INFO TaskSetManager - Starting task 0.0 in stage 125.0 (TID 181) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:07.274 INFO Executor - Running task 0.0 in stage 125.0 (TID 181)
20:14:07.279 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:07.279 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:07.291 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:07.291 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:07.291 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:07.291 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:07.291 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:07.291 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:07.315 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014072981663850323189766_0573_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest111889518754968235219.bam.parts/_temporary/0/task_202502102014072981663850323189766_0573_r_000000
20:14:07.316 INFO SparkHadoopMapRedUtil - attempt_202502102014072981663850323189766_0573_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:07.316 INFO Executor - Finished task 0.0 in stage 125.0 (TID 181). 1858 bytes result sent to driver
20:14:07.316 INFO TaskSetManager - Finished task 0.0 in stage 125.0 (TID 181) in 42 ms on localhost (executor driver) (1/1)
20:14:07.317 INFO TaskSchedulerImpl - Removed TaskSet 125.0, whose tasks have all completed, from pool
20:14:07.317 INFO DAGScheduler - ResultStage 125 (runJob at SparkHadoopWriter.scala:83) finished in 0.051 s
20:14:07.317 INFO DAGScheduler - Job 87 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.317 INFO TaskSchedulerImpl - Killing all running tasks in stage 125: Stage finished
20:14:07.317 INFO DAGScheduler - Job 87 finished: runJob at SparkHadoopWriter.scala:83, took 0.119233 s
20:14:07.317 INFO SparkHadoopWriter - Start to commit write Job job_202502102014072981663850323189766_0573.
20:14:07.323 INFO SparkHadoopWriter - Write Job job_202502102014072981663850323189766_0573 committed. Elapsed time: 5 ms.
20:14:07.335 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest111889518754968235219.bam
20:14:07.339 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest111889518754968235219.bam done
20:14:07.339 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest111889518754968235219.bam.parts/ to /tmp/ReadsSparkSinkUnitTest111889518754968235219.bam.sbi
20:14:07.345 INFO IndexFileMerger - Done merging .sbi files
20:14:07.345 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest111889518754968235219.bam.parts/ to /tmp/ReadsSparkSinkUnitTest111889518754968235219.bam.bai
20:14:07.350 INFO IndexFileMerger - Done merging .bai files
20:14:07.353 INFO MemoryStore - Block broadcast_232 stored as values in memory (estimated size 13.3 KiB, free 1918.0 MiB)
20:14:07.354 INFO MemoryStore - Block broadcast_232_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.0 MiB)
20:14:07.354 INFO BlockManagerInfo - Added broadcast_232_piece0 in memory on localhost:35739 (size: 8.3 KiB, free: 1919.6 MiB)
20:14:07.354 INFO SparkContext - Created broadcast 232 from broadcast at BamSource.java:104
20:14:07.355 INFO MemoryStore - Block broadcast_233 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
20:14:07.366 INFO MemoryStore - Block broadcast_233_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
20:14:07.366 INFO BlockManagerInfo - Added broadcast_233_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:07.366 INFO SparkContext - Created broadcast 233 from newAPIHadoopFile at PathSplitSource.java:96
20:14:07.380 INFO FileInputFormat - Total input files to process : 1
20:14:07.395 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:07.395 INFO DAGScheduler - Got job 88 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:07.395 INFO DAGScheduler - Final stage: ResultStage 126 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:07.395 INFO DAGScheduler - Parents of final stage: List()
20:14:07.395 INFO DAGScheduler - Missing parents: List()
20:14:07.395 INFO DAGScheduler - Submitting ResultStage 126 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:07.401 INFO MemoryStore - Block broadcast_234 stored as values in memory (estimated size 148.2 KiB, free 1917.5 MiB)
20:14:07.402 INFO MemoryStore - Block broadcast_234_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
20:14:07.402 INFO BlockManagerInfo - Added broadcast_234_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.5 MiB)
20:14:07.403 INFO SparkContext - Created broadcast 234 from broadcast at DAGScheduler.scala:1580
20:14:07.403 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 126 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:07.403 INFO TaskSchedulerImpl - Adding task set 126.0 with 1 tasks resource profile 0
20:14:07.403 INFO TaskSetManager - Starting task 0.0 in stage 126.0 (TID 182) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
20:14:07.404 INFO Executor - Running task 0.0 in stage 126.0 (TID 182)
20:14:07.416 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111889518754968235219.bam:0+237038
20:14:07.421 INFO Executor - Finished task 0.0 in stage 126.0 (TID 182). 651483 bytes result sent to driver
20:14:07.422 INFO TaskSetManager - Finished task 0.0 in stage 126.0 (TID 182) in 19 ms on localhost (executor driver) (1/1)
20:14:07.422 INFO TaskSchedulerImpl - Removed TaskSet 126.0, whose tasks have all completed, from pool
20:14:07.423 INFO DAGScheduler - ResultStage 126 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.027 s
20:14:07.423 INFO DAGScheduler - Job 88 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.423 INFO TaskSchedulerImpl - Killing all running tasks in stage 126: Stage finished
20:14:07.423 INFO DAGScheduler - Job 88 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.027937 s
20:14:07.432 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:07.432 INFO DAGScheduler - Got job 89 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:07.432 INFO DAGScheduler - Final stage: ResultStage 127 (count at ReadsSparkSinkUnitTest.java:185)
20:14:07.432 INFO DAGScheduler - Parents of final stage: List()
20:14:07.432 INFO DAGScheduler - Missing parents: List()
20:14:07.432 INFO DAGScheduler - Submitting ResultStage 127 (MapPartitionsRDD[561] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:07.449 INFO MemoryStore - Block broadcast_235 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
20:14:07.450 INFO MemoryStore - Block broadcast_235_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
20:14:07.451 INFO BlockManagerInfo - Added broadcast_235_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.4 MiB)
20:14:07.451 INFO SparkContext - Created broadcast 235 from broadcast at DAGScheduler.scala:1580
20:14:07.451 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 127 (MapPartitionsRDD[561] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:07.451 INFO TaskSchedulerImpl - Adding task set 127.0 with 1 tasks resource profile 0
20:14:07.451 INFO TaskSetManager - Starting task 0.0 in stage 127.0 (TID 183) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:07.452 INFO Executor - Running task 0.0 in stage 127.0 (TID 183)
20:14:07.481 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:07.491 INFO Executor - Finished task 0.0 in stage 127.0 (TID 183). 989 bytes result sent to driver
20:14:07.491 INFO TaskSetManager - Finished task 0.0 in stage 127.0 (TID 183) in 40 ms on localhost (executor driver) (1/1)
20:14:07.491 INFO TaskSchedulerImpl - Removed TaskSet 127.0, whose tasks have all completed, from pool
20:14:07.491 INFO DAGScheduler - ResultStage 127 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
20:14:07.491 INFO DAGScheduler - Job 89 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.491 INFO TaskSchedulerImpl - Killing all running tasks in stage 127: Stage finished
20:14:07.491 INFO DAGScheduler - Job 89 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059467 s
20:14:07.495 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:07.495 INFO DAGScheduler - Got job 90 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:07.495 INFO DAGScheduler - Final stage: ResultStage 128 (count at ReadsSparkSinkUnitTest.java:185)
20:14:07.495 INFO DAGScheduler - Parents of final stage: List()
20:14:07.495 INFO DAGScheduler - Missing parents: List()
20:14:07.495 INFO DAGScheduler - Submitting ResultStage 128 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:07.501 INFO MemoryStore - Block broadcast_236 stored as values in memory (estimated size 148.1 KiB, free 1916.7 MiB)
20:14:07.502 INFO MemoryStore - Block broadcast_236_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1916.7 MiB)
20:14:07.502 INFO BlockManagerInfo - Added broadcast_236_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.3 MiB)
20:14:07.502 INFO SparkContext - Created broadcast 236 from broadcast at DAGScheduler.scala:1580
20:14:07.502 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 128 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:07.502 INFO TaskSchedulerImpl - Adding task set 128.0 with 1 tasks resource profile 0
20:14:07.503 INFO TaskSetManager - Starting task 0.0 in stage 128.0 (TID 184) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
20:14:07.503 INFO Executor - Running task 0.0 in stage 128.0 (TID 184)
20:14:07.514 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111889518754968235219.bam:0+237038
20:14:07.517 INFO Executor - Finished task 0.0 in stage 128.0 (TID 184). 989 bytes result sent to driver
20:14:07.518 INFO TaskSetManager - Finished task 0.0 in stage 128.0 (TID 184) in 15 ms on localhost (executor driver) (1/1)
20:14:07.518 INFO TaskSchedulerImpl - Removed TaskSet 128.0, whose tasks have all completed, from pool
20:14:07.518 INFO DAGScheduler - ResultStage 128 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.023 s
20:14:07.518 INFO DAGScheduler - Job 90 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.518 INFO TaskSchedulerImpl - Killing all running tasks in stage 128: Stage finished
20:14:07.518 INFO DAGScheduler - Job 90 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.023127 s
20:14:07.520 INFO MemoryStore - Block broadcast_237 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
20:14:07.526 INFO MemoryStore - Block broadcast_237_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:14:07.526 INFO BlockManagerInfo - Added broadcast_237_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:07.527 INFO SparkContext - Created broadcast 237 from newAPIHadoopFile at PathSplitSource.java:96
20:14:07.548 INFO MemoryStore - Block broadcast_238 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
20:14:07.554 INFO MemoryStore - Block broadcast_238_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
20:14:07.554 INFO BlockManagerInfo - Added broadcast_238_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:07.554 INFO SparkContext - Created broadcast 238 from newAPIHadoopFile at PathSplitSource.java:96
20:14:07.574 INFO FileInputFormat - Total input files to process : 1
20:14:07.576 INFO MemoryStore - Block broadcast_239 stored as values in memory (estimated size 160.7 KiB, free 1915.9 MiB)
20:14:07.581 INFO MemoryStore - Block broadcast_239_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.9 MiB)
20:14:07.581 INFO BlockManagerInfo - Removed broadcast_229_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.3 MiB)
20:14:07.581 INFO BlockManagerInfo - Added broadcast_239_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.2 MiB)
20:14:07.581 INFO SparkContext - Created broadcast 239 from broadcast at ReadsSparkSink.java:133
20:14:07.582 INFO BlockManagerInfo - Removed broadcast_238_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:07.582 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:07.582 INFO BlockManagerInfo - Removed broadcast_235_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.4 MiB)
20:14:07.583 INFO BlockManagerInfo - Removed broadcast_230_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.6 MiB)
20:14:07.583 INFO MemoryStore - Block broadcast_240 stored as values in memory (estimated size 163.2 KiB, free 1917.4 MiB)
20:14:07.584 INFO BlockManagerInfo - Removed broadcast_236_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.7 MiB)
20:14:07.584 INFO MemoryStore - Block broadcast_240_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.6 MiB)
20:14:07.585 INFO BlockManagerInfo - Added broadcast_240_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:07.585 INFO SparkContext - Created broadcast 240 from broadcast at BamSink.java:76
20:14:07.585 INFO BlockManagerInfo - Removed broadcast_234_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.7 MiB)
20:14:07.585 INFO BlockManagerInfo - Removed broadcast_231_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.8 MiB)
20:14:07.586 INFO BlockManagerInfo - Removed broadcast_232_piece0 on localhost:35739 in memory (size: 8.3 KiB, free: 1919.8 MiB)
20:14:07.587 INFO BlockManagerInfo - Removed broadcast_233_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:07.587 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:07.587 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:07.587 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:07.587 INFO BlockManagerInfo - Removed broadcast_226_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:07.588 INFO BlockManagerInfo - Removed broadcast_227_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:07.588 INFO BlockManagerInfo - Removed broadcast_228_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.9 MiB)
20:14:07.606 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:07.606 INFO DAGScheduler - Registering RDD 593 (mapToPair at SparkUtils.java:161) as input to shuffle 27
20:14:07.606 INFO DAGScheduler - Got job 91 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:07.606 INFO DAGScheduler - Final stage: ResultStage 130 (runJob at SparkHadoopWriter.scala:83)
20:14:07.606 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 129)
20:14:07.606 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 129)
20:14:07.606 INFO DAGScheduler - Submitting ShuffleMapStage 129 (MapPartitionsRDD[593] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:07.624 INFO MemoryStore - Block broadcast_241 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
20:14:07.625 INFO MemoryStore - Block broadcast_241_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
20:14:07.625 INFO BlockManagerInfo - Added broadcast_241_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.8 MiB)
20:14:07.626 INFO SparkContext - Created broadcast 241 from broadcast at DAGScheduler.scala:1580
20:14:07.626 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 129 (MapPartitionsRDD[593] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:07.626 INFO TaskSchedulerImpl - Adding task set 129.0 with 1 tasks resource profile 0
20:14:07.626 INFO TaskSetManager - Starting task 0.0 in stage 129.0 (TID 185) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:07.627 INFO Executor - Running task 0.0 in stage 129.0 (TID 185)
20:14:07.657 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:07.672 INFO Executor - Finished task 0.0 in stage 129.0 (TID 185). 1148 bytes result sent to driver
20:14:07.672 INFO TaskSetManager - Finished task 0.0 in stage 129.0 (TID 185) in 46 ms on localhost (executor driver) (1/1)
20:14:07.672 INFO TaskSchedulerImpl - Removed TaskSet 129.0, whose tasks have all completed, from pool
20:14:07.672 INFO DAGScheduler - ShuffleMapStage 129 (mapToPair at SparkUtils.java:161) finished in 0.065 s
20:14:07.672 INFO DAGScheduler - looking for newly runnable stages
20:14:07.672 INFO DAGScheduler - running: HashSet()
20:14:07.672 INFO DAGScheduler - waiting: HashSet(ResultStage 130)
20:14:07.672 INFO DAGScheduler - failed: HashSet()
20:14:07.672 INFO DAGScheduler - Submitting ResultStage 130 (MapPartitionsRDD[598] at mapToPair at BamSink.java:91), which has no missing parents
20:14:07.679 INFO MemoryStore - Block broadcast_242 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
20:14:07.680 INFO MemoryStore - Block broadcast_242_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
20:14:07.680 INFO BlockManagerInfo - Added broadcast_242_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.7 MiB)
20:14:07.680 INFO SparkContext - Created broadcast 242 from broadcast at DAGScheduler.scala:1580
20:14:07.680 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 130 (MapPartitionsRDD[598] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:07.680 INFO TaskSchedulerImpl - Adding task set 130.0 with 1 tasks resource profile 0
20:14:07.681 INFO TaskSetManager - Starting task 0.0 in stage 130.0 (TID 186) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:07.681 INFO Executor - Running task 0.0 in stage 130.0 (TID 186)
20:14:07.688 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:07.688 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:07.703 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:07.703 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:07.703 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:07.703 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:07.703 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:07.703 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:07.723 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014075059251852674787148_0598_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest17545161943397643970.bam.parts/_temporary/0/task_202502102014075059251852674787148_0598_r_000000
20:14:07.723 INFO SparkHadoopMapRedUtil - attempt_202502102014075059251852674787148_0598_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:07.723 INFO Executor - Finished task 0.0 in stage 130.0 (TID 186). 1858 bytes result sent to driver
20:14:07.724 INFO TaskSetManager - Finished task 0.0 in stage 130.0 (TID 186) in 43 ms on localhost (executor driver) (1/1)
20:14:07.724 INFO TaskSchedulerImpl - Removed TaskSet 130.0, whose tasks have all completed, from pool
20:14:07.724 INFO DAGScheduler - ResultStage 130 (runJob at SparkHadoopWriter.scala:83) finished in 0.051 s
20:14:07.724 INFO DAGScheduler - Job 91 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.724 INFO TaskSchedulerImpl - Killing all running tasks in stage 130: Stage finished
20:14:07.724 INFO DAGScheduler - Job 91 finished: runJob at SparkHadoopWriter.scala:83, took 0.118752 s
20:14:07.725 INFO SparkHadoopWriter - Start to commit write Job job_202502102014075059251852674787148_0598.
20:14:07.729 INFO SparkHadoopWriter - Write Job job_202502102014075059251852674787148_0598 committed. Elapsed time: 4 ms.
20:14:07.741 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest17545161943397643970.bam
20:14:07.745 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest17545161943397643970.bam done
20:14:07.745 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest17545161943397643970.bam.parts/ to /tmp/ReadsSparkSinkUnitTest17545161943397643970.bam.bai
20:14:07.751 INFO IndexFileMerger - Done merging .bai files
20:14:07.754 INFO MemoryStore - Block broadcast_243 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:07.760 INFO MemoryStore - Block broadcast_243_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:07.760 INFO BlockManagerInfo - Added broadcast_243_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:07.760 INFO SparkContext - Created broadcast 243 from newAPIHadoopFile at PathSplitSource.java:96
20:14:07.780 INFO FileInputFormat - Total input files to process : 1
20:14:07.815 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:07.815 INFO DAGScheduler - Got job 92 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:07.815 INFO DAGScheduler - Final stage: ResultStage 131 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:07.815 INFO DAGScheduler - Parents of final stage: List()
20:14:07.815 INFO DAGScheduler - Missing parents: List()
20:14:07.816 INFO DAGScheduler - Submitting ResultStage 131 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:07.832 INFO MemoryStore - Block broadcast_244 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
20:14:07.833 INFO MemoryStore - Block broadcast_244_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1917.4 MiB)
20:14:07.834 INFO BlockManagerInfo - Added broadcast_244_piece0 in memory on localhost:35739 (size: 153.7 KiB, free: 1919.5 MiB)
20:14:07.834 INFO SparkContext - Created broadcast 244 from broadcast at DAGScheduler.scala:1580
20:14:07.834 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 131 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:07.834 INFO TaskSchedulerImpl - Adding task set 131.0 with 1 tasks resource profile 0
20:14:07.835 INFO TaskSetManager - Starting task 0.0 in stage 131.0 (TID 187) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:07.835 INFO Executor - Running task 0.0 in stage 131.0 (TID 187)
20:14:07.878 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17545161943397643970.bam:0+237038
20:14:07.891 INFO Executor - Finished task 0.0 in stage 131.0 (TID 187). 651483 bytes result sent to driver
20:14:07.892 INFO TaskSetManager - Finished task 0.0 in stage 131.0 (TID 187) in 58 ms on localhost (executor driver) (1/1)
20:14:07.892 INFO TaskSchedulerImpl - Removed TaskSet 131.0, whose tasks have all completed, from pool
20:14:07.892 INFO DAGScheduler - ResultStage 131 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.076 s
20:14:07.893 INFO DAGScheduler - Job 92 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.893 INFO TaskSchedulerImpl - Killing all running tasks in stage 131: Stage finished
20:14:07.893 INFO DAGScheduler - Job 92 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.077630 s
20:14:07.907 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:07.908 INFO DAGScheduler - Got job 93 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:07.908 INFO DAGScheduler - Final stage: ResultStage 132 (count at ReadsSparkSinkUnitTest.java:185)
20:14:07.908 INFO DAGScheduler - Parents of final stage: List()
20:14:07.908 INFO DAGScheduler - Missing parents: List()
20:14:07.908 INFO DAGScheduler - Submitting ResultStage 132 (MapPartitionsRDD[586] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:07.925 INFO MemoryStore - Block broadcast_245 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
20:14:07.926 INFO MemoryStore - Block broadcast_245_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
20:14:07.926 INFO BlockManagerInfo - Added broadcast_245_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.4 MiB)
20:14:07.926 INFO SparkContext - Created broadcast 245 from broadcast at DAGScheduler.scala:1580
20:14:07.926 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 132 (MapPartitionsRDD[586] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:07.926 INFO TaskSchedulerImpl - Adding task set 132.0 with 1 tasks resource profile 0
20:14:07.927 INFO TaskSetManager - Starting task 0.0 in stage 132.0 (TID 188) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:07.927 INFO Executor - Running task 0.0 in stage 132.0 (TID 188)
20:14:07.956 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:07.965 INFO Executor - Finished task 0.0 in stage 132.0 (TID 188). 989 bytes result sent to driver
20:14:07.966 INFO TaskSetManager - Finished task 0.0 in stage 132.0 (TID 188) in 39 ms on localhost (executor driver) (1/1)
20:14:07.966 INFO TaskSchedulerImpl - Removed TaskSet 132.0, whose tasks have all completed, from pool
20:14:07.966 INFO DAGScheduler - ResultStage 132 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
20:14:07.966 INFO DAGScheduler - Job 93 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:07.966 INFO TaskSchedulerImpl - Killing all running tasks in stage 132: Stage finished
20:14:07.966 INFO DAGScheduler - Job 93 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058441 s
20:14:07.969 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:07.969 INFO DAGScheduler - Got job 94 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:07.969 INFO DAGScheduler - Final stage: ResultStage 133 (count at ReadsSparkSinkUnitTest.java:185)
20:14:07.969 INFO DAGScheduler - Parents of final stage: List()
20:14:07.969 INFO DAGScheduler - Missing parents: List()
20:14:07.970 INFO DAGScheduler - Submitting ResultStage 133 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:07.989 INFO MemoryStore - Block broadcast_246 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
20:14:07.991 INFO MemoryStore - Block broadcast_246_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
20:14:07.991 INFO BlockManagerInfo - Added broadcast_246_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.2 MiB)
20:14:07.991 INFO SparkContext - Created broadcast 246 from broadcast at DAGScheduler.scala:1580
20:14:07.991 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 133 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:07.991 INFO TaskSchedulerImpl - Adding task set 133.0 with 1 tasks resource profile 0
20:14:07.992 INFO TaskSetManager - Starting task 0.0 in stage 133.0 (TID 189) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:07.992 INFO Executor - Running task 0.0 in stage 133.0 (TID 189)
20:14:08.020 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17545161943397643970.bam:0+237038
20:14:08.031 INFO Executor - Finished task 0.0 in stage 133.0 (TID 189). 989 bytes result sent to driver
20:14:08.031 INFO TaskSetManager - Finished task 0.0 in stage 133.0 (TID 189) in 39 ms on localhost (executor driver) (1/1)
20:14:08.032 INFO TaskSchedulerImpl - Removed TaskSet 133.0, whose tasks have all completed, from pool
20:14:08.032 INFO DAGScheduler - ResultStage 133 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.062 s
20:14:08.032 INFO DAGScheduler - Job 94 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.032 INFO TaskSchedulerImpl - Killing all running tasks in stage 133: Stage finished
20:14:08.032 INFO DAGScheduler - Job 94 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.062683 s
20:14:08.035 INFO MemoryStore - Block broadcast_247 stored as values in memory (estimated size 297.9 KiB, free 1916.0 MiB)
20:14:08.041 INFO MemoryStore - Block broadcast_247_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
20:14:08.041 INFO BlockManagerInfo - Added broadcast_247_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:08.041 INFO SparkContext - Created broadcast 247 from newAPIHadoopFile at PathSplitSource.java:96
20:14:08.062 INFO MemoryStore - Block broadcast_248 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
20:14:08.068 INFO MemoryStore - Block broadcast_248_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.6 MiB)
20:14:08.068 INFO BlockManagerInfo - Added broadcast_248_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.1 MiB)
20:14:08.068 INFO SparkContext - Created broadcast 248 from newAPIHadoopFile at PathSplitSource.java:96
20:14:08.088 INFO FileInputFormat - Total input files to process : 1
20:14:08.090 INFO MemoryStore - Block broadcast_249 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
20:14:08.090 INFO MemoryStore - Block broadcast_249_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
20:14:08.090 INFO BlockManagerInfo - Added broadcast_249_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.1 MiB)
20:14:08.091 INFO SparkContext - Created broadcast 249 from broadcast at ReadsSparkSink.java:133
20:14:08.091 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:08.092 INFO MemoryStore - Block broadcast_250 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
20:14:08.092 INFO MemoryStore - Block broadcast_250_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
20:14:08.093 INFO BlockManagerInfo - Added broadcast_250_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.1 MiB)
20:14:08.093 INFO SparkContext - Created broadcast 250 from broadcast at BamSink.java:76
20:14:08.094 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:08.094 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:08.094 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:08.111 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:08.111 INFO DAGScheduler - Registering RDD 619 (mapToPair at SparkUtils.java:161) as input to shuffle 28
20:14:08.112 INFO DAGScheduler - Got job 95 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:08.112 INFO DAGScheduler - Final stage: ResultStage 135 (runJob at SparkHadoopWriter.scala:83)
20:14:08.112 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 134)
20:14:08.112 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 134)
20:14:08.112 INFO DAGScheduler - Submitting ShuffleMapStage 134 (MapPartitionsRDD[619] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:08.139 INFO MemoryStore - Block broadcast_251 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
20:14:08.140 INFO MemoryStore - Block broadcast_251_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
20:14:08.141 INFO BlockManagerInfo - Added broadcast_251_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1918.9 MiB)
20:14:08.141 INFO SparkContext - Created broadcast 251 from broadcast at DAGScheduler.scala:1580
20:14:08.141 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 134 (MapPartitionsRDD[619] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:08.141 INFO TaskSchedulerImpl - Adding task set 134.0 with 1 tasks resource profile 0
20:14:08.142 INFO TaskSetManager - Starting task 0.0 in stage 134.0 (TID 190) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:08.142 INFO Executor - Running task 0.0 in stage 134.0 (TID 190)
20:14:08.175 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:08.187 INFO BlockManagerInfo - Removed broadcast_241_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.1 MiB)
20:14:08.188 INFO BlockManagerInfo - Removed broadcast_248_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:14:08.188 INFO BlockManagerInfo - Removed broadcast_243_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.2 MiB)
20:14:08.189 INFO BlockManagerInfo - Removed broadcast_246_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.3 MiB)
20:14:08.189 INFO BlockManagerInfo - Removed broadcast_244_piece0 on localhost:35739 in memory (size: 153.7 KiB, free: 1919.5 MiB)
20:14:08.190 INFO BlockManagerInfo - Removed broadcast_242_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.6 MiB)
20:14:08.190 INFO BlockManagerInfo - Removed broadcast_245_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:08.192 INFO BlockManagerInfo - Removed broadcast_239_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:14:08.192 INFO BlockManagerInfo - Removed broadcast_237_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:08.193 INFO BlockManagerInfo - Removed broadcast_240_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:14:08.197 INFO Executor - Finished task 0.0 in stage 134.0 (TID 190). 1191 bytes result sent to driver
20:14:08.198 INFO TaskSetManager - Finished task 0.0 in stage 134.0 (TID 190) in 57 ms on localhost (executor driver) (1/1)
20:14:08.198 INFO TaskSchedulerImpl - Removed TaskSet 134.0, whose tasks have all completed, from pool
20:14:08.198 INFO DAGScheduler - ShuffleMapStage 134 (mapToPair at SparkUtils.java:161) finished in 0.086 s
20:14:08.198 INFO DAGScheduler - looking for newly runnable stages
20:14:08.198 INFO DAGScheduler - running: HashSet()
20:14:08.198 INFO DAGScheduler - waiting: HashSet(ResultStage 135)
20:14:08.198 INFO DAGScheduler - failed: HashSet()
20:14:08.198 INFO DAGScheduler - Submitting ResultStage 135 (MapPartitionsRDD[624] at mapToPair at BamSink.java:91), which has no missing parents
20:14:08.205 INFO MemoryStore - Block broadcast_252 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
20:14:08.205 INFO MemoryStore - Block broadcast_252_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
20:14:08.205 INFO BlockManagerInfo - Added broadcast_252_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.7 MiB)
20:14:08.206 INFO SparkContext - Created broadcast 252 from broadcast at DAGScheduler.scala:1580
20:14:08.206 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 135 (MapPartitionsRDD[624] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:08.206 INFO TaskSchedulerImpl - Adding task set 135.0 with 1 tasks resource profile 0
20:14:08.206 INFO TaskSetManager - Starting task 0.0 in stage 135.0 (TID 191) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:08.207 INFO Executor - Running task 0.0 in stage 135.0 (TID 191)
20:14:08.212 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:08.212 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:08.223 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:08.223 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:08.223 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:08.224 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:08.224 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:08.224 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:08.242 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014083615158304908006616_0624_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest117372232283063189736.bam.parts/_temporary/0/task_202502102014083615158304908006616_0624_r_000000
20:14:08.242 INFO SparkHadoopMapRedUtil - attempt_202502102014083615158304908006616_0624_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:08.243 INFO Executor - Finished task 0.0 in stage 135.0 (TID 191). 1858 bytes result sent to driver
20:14:08.243 INFO TaskSetManager - Finished task 0.0 in stage 135.0 (TID 191) in 37 ms on localhost (executor driver) (1/1)
20:14:08.243 INFO TaskSchedulerImpl - Removed TaskSet 135.0, whose tasks have all completed, from pool
20:14:08.244 INFO DAGScheduler - ResultStage 135 (runJob at SparkHadoopWriter.scala:83) finished in 0.046 s
20:14:08.244 INFO DAGScheduler - Job 95 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.244 INFO TaskSchedulerImpl - Killing all running tasks in stage 135: Stage finished
20:14:08.244 INFO DAGScheduler - Job 95 finished: runJob at SparkHadoopWriter.scala:83, took 0.132917 s
20:14:08.244 INFO SparkHadoopWriter - Start to commit write Job job_202502102014083615158304908006616_0624.
20:14:08.249 INFO SparkHadoopWriter - Write Job job_202502102014083615158304908006616_0624 committed. Elapsed time: 4 ms.
20:14:08.261 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest117372232283063189736.bam
20:14:08.266 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest117372232283063189736.bam done
20:14:08.266 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest117372232283063189736.bam.parts/ to /tmp/ReadsSparkSinkUnitTest117372232283063189736.bam.sbi
20:14:08.271 INFO IndexFileMerger - Done merging .sbi files
20:14:08.272 INFO MemoryStore - Block broadcast_253 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
20:14:08.273 INFO MemoryStore - Block broadcast_253_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
20:14:08.273 INFO BlockManagerInfo - Added broadcast_253_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.7 MiB)
20:14:08.273 INFO SparkContext - Created broadcast 253 from broadcast at BamSource.java:104
20:14:08.274 INFO MemoryStore - Block broadcast_254 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:08.282 INFO MemoryStore - Block broadcast_254_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:08.282 INFO BlockManagerInfo - Added broadcast_254_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:08.282 INFO SparkContext - Created broadcast 254 from newAPIHadoopFile at PathSplitSource.java:96
20:14:08.293 INFO FileInputFormat - Total input files to process : 1
20:14:08.315 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:08.315 INFO DAGScheduler - Got job 96 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:08.315 INFO DAGScheduler - Final stage: ResultStage 136 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:08.315 INFO DAGScheduler - Parents of final stage: List()
20:14:08.315 INFO DAGScheduler - Missing parents: List()
20:14:08.315 INFO DAGScheduler - Submitting ResultStage 136 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:08.321 INFO MemoryStore - Block broadcast_255 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
20:14:08.322 INFO MemoryStore - Block broadcast_255_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
20:14:08.322 INFO BlockManagerInfo - Added broadcast_255_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:14:08.322 INFO SparkContext - Created broadcast 255 from broadcast at DAGScheduler.scala:1580
20:14:08.322 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 136 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:08.322 INFO TaskSchedulerImpl - Adding task set 136.0 with 1 tasks resource profile 0
20:14:08.323 INFO TaskSetManager - Starting task 0.0 in stage 136.0 (TID 192) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
20:14:08.323 INFO Executor - Running task 0.0 in stage 136.0 (TID 192)
20:14:08.335 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117372232283063189736.bam:0+237038
20:14:08.339 INFO Executor - Finished task 0.0 in stage 136.0 (TID 192). 651483 bytes result sent to driver
20:14:08.341 INFO TaskSetManager - Finished task 0.0 in stage 136.0 (TID 192) in 18 ms on localhost (executor driver) (1/1)
20:14:08.341 INFO TaskSchedulerImpl - Removed TaskSet 136.0, whose tasks have all completed, from pool
20:14:08.341 INFO DAGScheduler - ResultStage 136 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.026 s
20:14:08.341 INFO DAGScheduler - Job 96 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.341 INFO TaskSchedulerImpl - Killing all running tasks in stage 136: Stage finished
20:14:08.341 INFO DAGScheduler - Job 96 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026392 s
20:14:08.350 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:08.350 INFO DAGScheduler - Got job 97 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:08.350 INFO DAGScheduler - Final stage: ResultStage 137 (count at ReadsSparkSinkUnitTest.java:185)
20:14:08.350 INFO DAGScheduler - Parents of final stage: List()
20:14:08.350 INFO DAGScheduler - Missing parents: List()
20:14:08.351 INFO DAGScheduler - Submitting ResultStage 137 (MapPartitionsRDD[612] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:08.367 INFO MemoryStore - Block broadcast_256 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
20:14:08.368 INFO MemoryStore - Block broadcast_256_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
20:14:08.369 INFO BlockManagerInfo - Added broadcast_256_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:08.369 INFO SparkContext - Created broadcast 256 from broadcast at DAGScheduler.scala:1580
20:14:08.369 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 137 (MapPartitionsRDD[612] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:08.369 INFO TaskSchedulerImpl - Adding task set 137.0 with 1 tasks resource profile 0
20:14:08.369 INFO TaskSetManager - Starting task 0.0 in stage 137.0 (TID 193) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:08.370 INFO Executor - Running task 0.0 in stage 137.0 (TID 193)
20:14:08.399 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:08.408 INFO Executor - Finished task 0.0 in stage 137.0 (TID 193). 989 bytes result sent to driver
20:14:08.408 INFO TaskSetManager - Finished task 0.0 in stage 137.0 (TID 193) in 39 ms on localhost (executor driver) (1/1)
20:14:08.409 INFO TaskSchedulerImpl - Removed TaskSet 137.0, whose tasks have all completed, from pool
20:14:08.409 INFO DAGScheduler - ResultStage 137 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
20:14:08.409 INFO DAGScheduler - Job 97 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.409 INFO TaskSchedulerImpl - Killing all running tasks in stage 137: Stage finished
20:14:08.409 INFO DAGScheduler - Job 97 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058730 s
20:14:08.412 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:08.412 INFO DAGScheduler - Got job 98 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:08.412 INFO DAGScheduler - Final stage: ResultStage 138 (count at ReadsSparkSinkUnitTest.java:185)
20:14:08.412 INFO DAGScheduler - Parents of final stage: List()
20:14:08.412 INFO DAGScheduler - Missing parents: List()
20:14:08.413 INFO DAGScheduler - Submitting ResultStage 138 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:08.419 INFO MemoryStore - Block broadcast_257 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
20:14:08.419 INFO MemoryStore - Block broadcast_257_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
20:14:08.419 INFO BlockManagerInfo - Added broadcast_257_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.4 MiB)
20:14:08.420 INFO SparkContext - Created broadcast 257 from broadcast at DAGScheduler.scala:1580
20:14:08.420 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 138 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:08.420 INFO TaskSchedulerImpl - Adding task set 138.0 with 1 tasks resource profile 0
20:14:08.420 INFO TaskSetManager - Starting task 0.0 in stage 138.0 (TID 194) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
20:14:08.421 INFO Executor - Running task 0.0 in stage 138.0 (TID 194)
20:14:08.432 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117372232283063189736.bam:0+237038
20:14:08.435 INFO Executor - Finished task 0.0 in stage 138.0 (TID 194). 989 bytes result sent to driver
20:14:08.435 INFO TaskSetManager - Finished task 0.0 in stage 138.0 (TID 194) in 15 ms on localhost (executor driver) (1/1)
20:14:08.435 INFO TaskSchedulerImpl - Removed TaskSet 138.0, whose tasks have all completed, from pool
20:14:08.435 INFO DAGScheduler - ResultStage 138 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
20:14:08.436 INFO DAGScheduler - Job 98 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.436 INFO TaskSchedulerImpl - Killing all running tasks in stage 138: Stage finished
20:14:08.436 INFO DAGScheduler - Job 98 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.023403 s
20:14:08.438 INFO MemoryStore - Block broadcast_258 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
20:14:08.445 INFO MemoryStore - Block broadcast_258_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
20:14:08.445 INFO BlockManagerInfo - Added broadcast_258_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:08.445 INFO SparkContext - Created broadcast 258 from newAPIHadoopFile at PathSplitSource.java:96
20:14:08.466 INFO MemoryStore - Block broadcast_259 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
20:14:08.472 INFO MemoryStore - Block broadcast_259_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:14:08.472 INFO BlockManagerInfo - Added broadcast_259_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:08.472 INFO SparkContext - Created broadcast 259 from newAPIHadoopFile at PathSplitSource.java:96
20:14:08.492 INFO FileInputFormat - Total input files to process : 1
20:14:08.493 INFO MemoryStore - Block broadcast_260 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
20:14:08.494 INFO MemoryStore - Block broadcast_260_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
20:14:08.494 INFO BlockManagerInfo - Added broadcast_260_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:08.494 INFO SparkContext - Created broadcast 260 from broadcast at ReadsSparkSink.java:133
20:14:08.495 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:08.495 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:08.496 INFO MemoryStore - Block broadcast_261 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
20:14:08.496 INFO MemoryStore - Block broadcast_261_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
20:14:08.496 INFO BlockManagerInfo - Added broadcast_261_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:08.497 INFO SparkContext - Created broadcast 261 from broadcast at BamSink.java:76
20:14:08.498 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:08.498 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:08.498 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:08.515 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:08.515 INFO DAGScheduler - Registering RDD 644 (mapToPair at SparkUtils.java:161) as input to shuffle 29
20:14:08.515 INFO DAGScheduler - Got job 99 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:08.515 INFO DAGScheduler - Final stage: ResultStage 140 (runJob at SparkHadoopWriter.scala:83)
20:14:08.515 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 139)
20:14:08.515 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 139)
20:14:08.516 INFO DAGScheduler - Submitting ShuffleMapStage 139 (MapPartitionsRDD[644] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:08.533 INFO MemoryStore - Block broadcast_262 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
20:14:08.534 INFO MemoryStore - Block broadcast_262_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
20:14:08.534 INFO BlockManagerInfo - Added broadcast_262_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.1 MiB)
20:14:08.535 INFO SparkContext - Created broadcast 262 from broadcast at DAGScheduler.scala:1580
20:14:08.535 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 139 (MapPartitionsRDD[644] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:08.535 INFO TaskSchedulerImpl - Adding task set 139.0 with 1 tasks resource profile 0
20:14:08.535 INFO TaskSetManager - Starting task 0.0 in stage 139.0 (TID 195) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:08.536 INFO Executor - Running task 0.0 in stage 139.0 (TID 195)
20:14:08.565 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:08.580 INFO Executor - Finished task 0.0 in stage 139.0 (TID 195). 1148 bytes result sent to driver
20:14:08.580 INFO TaskSetManager - Finished task 0.0 in stage 139.0 (TID 195) in 45 ms on localhost (executor driver) (1/1)
20:14:08.580 INFO TaskSchedulerImpl - Removed TaskSet 139.0, whose tasks have all completed, from pool
20:14:08.580 INFO DAGScheduler - ShuffleMapStage 139 (mapToPair at SparkUtils.java:161) finished in 0.064 s
20:14:08.580 INFO DAGScheduler - looking for newly runnable stages
20:14:08.580 INFO DAGScheduler - running: HashSet()
20:14:08.580 INFO DAGScheduler - waiting: HashSet(ResultStage 140)
20:14:08.580 INFO DAGScheduler - failed: HashSet()
20:14:08.580 INFO DAGScheduler - Submitting ResultStage 140 (MapPartitionsRDD[649] at mapToPair at BamSink.java:91), which has no missing parents
20:14:08.587 INFO MemoryStore - Block broadcast_263 stored as values in memory (estimated size 241.4 KiB, free 1915.1 MiB)
20:14:08.588 INFO MemoryStore - Block broadcast_263_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.1 MiB)
20:14:08.588 INFO BlockManagerInfo - Added broadcast_263_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.1 MiB)
20:14:08.588 INFO SparkContext - Created broadcast 263 from broadcast at DAGScheduler.scala:1580
20:14:08.588 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 140 (MapPartitionsRDD[649] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:08.588 INFO TaskSchedulerImpl - Adding task set 140.0 with 1 tasks resource profile 0
20:14:08.589 INFO TaskSetManager - Starting task 0.0 in stage 140.0 (TID 196) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:08.589 INFO Executor - Running task 0.0 in stage 140.0 (TID 196)
20:14:08.595 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:08.595 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:08.612 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:08.612 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:08.612 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:08.612 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:08.612 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:08.612 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:08.628 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014086913510682836042398_0649_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest17605949503825917835.bam.parts/_temporary/0/task_202502102014086913510682836042398_0649_r_000000
20:14:08.628 INFO SparkHadoopMapRedUtil - attempt_202502102014086913510682836042398_0649_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:08.629 INFO Executor - Finished task 0.0 in stage 140.0 (TID 196). 1858 bytes result sent to driver
20:14:08.629 INFO TaskSetManager - Finished task 0.0 in stage 140.0 (TID 196) in 40 ms on localhost (executor driver) (1/1)
20:14:08.629 INFO TaskSchedulerImpl - Removed TaskSet 140.0, whose tasks have all completed, from pool
20:14:08.630 INFO DAGScheduler - ResultStage 140 (runJob at SparkHadoopWriter.scala:83) finished in 0.048 s
20:14:08.630 INFO DAGScheduler - Job 99 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.630 INFO TaskSchedulerImpl - Killing all running tasks in stage 140: Stage finished
20:14:08.630 INFO DAGScheduler - Job 99 finished: runJob at SparkHadoopWriter.scala:83, took 0.115177 s
20:14:08.630 INFO SparkHadoopWriter - Start to commit write Job job_202502102014086913510682836042398_0649.
20:14:08.635 INFO SparkHadoopWriter - Write Job job_202502102014086913510682836042398_0649 committed. Elapsed time: 4 ms.
20:14:08.657 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest17605949503825917835.bam
20:14:08.662 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest17605949503825917835.bam done
20:14:08.665 INFO MemoryStore - Block broadcast_264 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
20:14:08.673 INFO BlockManagerInfo - Removed broadcast_253_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.1 MiB)
20:14:08.674 INFO BlockManagerInfo - Removed broadcast_249_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.1 MiB)
20:14:08.674 INFO BlockManagerInfo - Removed broadcast_254_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:14:08.675 INFO BlockManagerInfo - Removed broadcast_252_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.2 MiB)
20:14:08.675 INFO BlockManagerInfo - Removed broadcast_251_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.3 MiB)
20:14:08.676 INFO BlockManagerInfo - Removed broadcast_255_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.4 MiB)
20:14:08.677 INFO BlockManagerInfo - Removed broadcast_262_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.6 MiB)
20:14:08.677 INFO BlockManagerInfo - Removed broadcast_259_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:08.678 INFO BlockManagerInfo - Removed broadcast_261_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.6 MiB)
20:14:08.679 INFO BlockManagerInfo - Removed broadcast_247_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:08.680 INFO BlockManagerInfo - Removed broadcast_256_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.8 MiB)
20:14:08.680 INFO BlockManagerInfo - Removed broadcast_250_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:14:08.680 INFO MemoryStore - Block broadcast_264_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
20:14:08.680 INFO BlockManagerInfo - Added broadcast_264_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:14:08.681 INFO BlockManagerInfo - Removed broadcast_260_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:14:08.681 INFO SparkContext - Created broadcast 264 from newAPIHadoopFile at PathSplitSource.java:96
20:14:08.682 INFO BlockManagerInfo - Removed broadcast_263_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.8 MiB)
20:14:08.683 INFO BlockManagerInfo - Removed broadcast_257_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.9 MiB)
20:14:08.704 INFO FileInputFormat - Total input files to process : 1
20:14:08.739 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:08.739 INFO DAGScheduler - Got job 100 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:08.739 INFO DAGScheduler - Final stage: ResultStage 141 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:08.739 INFO DAGScheduler - Parents of final stage: List()
20:14:08.739 INFO DAGScheduler - Missing parents: List()
20:14:08.739 INFO DAGScheduler - Submitting ResultStage 141 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:08.756 INFO MemoryStore - Block broadcast_265 stored as values in memory (estimated size 426.2 KiB, free 1918.9 MiB)
20:14:08.757 INFO MemoryStore - Block broadcast_265_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.8 MiB)
20:14:08.757 INFO BlockManagerInfo - Added broadcast_265_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.8 MiB)
20:14:08.757 INFO SparkContext - Created broadcast 265 from broadcast at DAGScheduler.scala:1580
20:14:08.758 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 141 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:08.758 INFO TaskSchedulerImpl - Adding task set 141.0 with 1 tasks resource profile 0
20:14:08.758 INFO TaskSetManager - Starting task 0.0 in stage 141.0 (TID 197) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:08.759 INFO Executor - Running task 0.0 in stage 141.0 (TID 197)
20:14:08.788 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17605949503825917835.bam:0+237038
20:14:08.800 INFO Executor - Finished task 0.0 in stage 141.0 (TID 197). 651526 bytes result sent to driver
20:14:08.802 INFO TaskSetManager - Finished task 0.0 in stage 141.0 (TID 197) in 44 ms on localhost (executor driver) (1/1)
20:14:08.802 INFO TaskSchedulerImpl - Removed TaskSet 141.0, whose tasks have all completed, from pool
20:14:08.802 INFO DAGScheduler - ResultStage 141 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.062 s
20:14:08.802 INFO DAGScheduler - Job 100 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.802 INFO TaskSchedulerImpl - Killing all running tasks in stage 141: Stage finished
20:14:08.802 INFO DAGScheduler - Job 100 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.063463 s
20:14:08.812 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:08.812 INFO DAGScheduler - Got job 101 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:08.812 INFO DAGScheduler - Final stage: ResultStage 142 (count at ReadsSparkSinkUnitTest.java:185)
20:14:08.812 INFO DAGScheduler - Parents of final stage: List()
20:14:08.812 INFO DAGScheduler - Missing parents: List()
20:14:08.812 INFO DAGScheduler - Submitting ResultStage 142 (MapPartitionsRDD[637] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:08.829 INFO MemoryStore - Block broadcast_266 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
20:14:08.830 INFO MemoryStore - Block broadcast_266_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
20:14:08.830 INFO BlockManagerInfo - Added broadcast_266_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.6 MiB)
20:14:08.830 INFO SparkContext - Created broadcast 266 from broadcast at DAGScheduler.scala:1580
20:14:08.830 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 142 (MapPartitionsRDD[637] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:08.830 INFO TaskSchedulerImpl - Adding task set 142.0 with 1 tasks resource profile 0
20:14:08.831 INFO TaskSetManager - Starting task 0.0 in stage 142.0 (TID 198) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:08.831 INFO Executor - Running task 0.0 in stage 142.0 (TID 198)
20:14:08.860 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:08.869 INFO Executor - Finished task 0.0 in stage 142.0 (TID 198). 989 bytes result sent to driver
20:14:08.869 INFO TaskSetManager - Finished task 0.0 in stage 142.0 (TID 198) in 38 ms on localhost (executor driver) (1/1)
20:14:08.869 INFO TaskSchedulerImpl - Removed TaskSet 142.0, whose tasks have all completed, from pool
20:14:08.869 INFO DAGScheduler - ResultStage 142 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:08.870 INFO DAGScheduler - Job 101 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.870 INFO TaskSchedulerImpl - Killing all running tasks in stage 142: Stage finished
20:14:08.870 INFO DAGScheduler - Job 101 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058077 s
20:14:08.873 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:08.873 INFO DAGScheduler - Got job 102 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:08.873 INFO DAGScheduler - Final stage: ResultStage 143 (count at ReadsSparkSinkUnitTest.java:185)
20:14:08.873 INFO DAGScheduler - Parents of final stage: List()
20:14:08.873 INFO DAGScheduler - Missing parents: List()
20:14:08.873 INFO DAGScheduler - Submitting ResultStage 143 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:08.890 INFO MemoryStore - Block broadcast_267 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:14:08.891 INFO MemoryStore - Block broadcast_267_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:14:08.891 INFO BlockManagerInfo - Added broadcast_267_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:08.891 INFO SparkContext - Created broadcast 267 from broadcast at DAGScheduler.scala:1580
20:14:08.891 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 143 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:08.891 INFO TaskSchedulerImpl - Adding task set 143.0 with 1 tasks resource profile 0
20:14:08.892 INFO TaskSetManager - Starting task 0.0 in stage 143.0 (TID 199) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:08.892 INFO Executor - Running task 0.0 in stage 143.0 (TID 199)
20:14:08.920 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17605949503825917835.bam:0+237038
20:14:08.931 INFO Executor - Finished task 0.0 in stage 143.0 (TID 199). 989 bytes result sent to driver
20:14:08.931 INFO TaskSetManager - Finished task 0.0 in stage 143.0 (TID 199) in 39 ms on localhost (executor driver) (1/1)
20:14:08.931 INFO TaskSchedulerImpl - Removed TaskSet 143.0, whose tasks have all completed, from pool
20:14:08.931 INFO DAGScheduler - ResultStage 143 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:08.931 INFO DAGScheduler - Job 102 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:08.931 INFO TaskSchedulerImpl - Killing all running tasks in stage 143: Stage finished
20:14:08.932 INFO DAGScheduler - Job 102 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058640 s
20:14:08.934 INFO MemoryStore - Block broadcast_268 stored as values in memory (estimated size 298.0 KiB, free 1917.3 MiB)
20:14:08.940 INFO MemoryStore - Block broadcast_268_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.3 MiB)
20:14:08.940 INFO BlockManagerInfo - Added broadcast_268_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.4 MiB)
20:14:08.940 INFO SparkContext - Created broadcast 268 from newAPIHadoopFile at PathSplitSource.java:96
20:14:08.961 INFO MemoryStore - Block broadcast_269 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
20:14:08.967 INFO MemoryStore - Block broadcast_269_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.9 MiB)
20:14:08.967 INFO BlockManagerInfo - Added broadcast_269_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.4 MiB)
20:14:08.968 INFO SparkContext - Created broadcast 269 from newAPIHadoopFile at PathSplitSource.java:96
20:14:08.987 INFO FileInputFormat - Total input files to process : 1
20:14:08.989 INFO MemoryStore - Block broadcast_270 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
20:14:08.990 INFO MemoryStore - Block broadcast_270_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
20:14:08.990 INFO BlockManagerInfo - Added broadcast_270_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:08.990 INFO SparkContext - Created broadcast 270 from broadcast at ReadsSparkSink.java:133
20:14:08.991 INFO MemoryStore - Block broadcast_271 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
20:14:08.992 INFO MemoryStore - Block broadcast_271_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
20:14:08.992 INFO BlockManagerInfo - Added broadcast_271_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:08.992 INFO SparkContext - Created broadcast 271 from broadcast at BamSink.java:76
20:14:08.994 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:08.994 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:08.994 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.011 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:09.012 INFO DAGScheduler - Registering RDD 670 (mapToPair at SparkUtils.java:161) as input to shuffle 30
20:14:09.012 INFO DAGScheduler - Got job 103 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:09.012 INFO DAGScheduler - Final stage: ResultStage 145 (runJob at SparkHadoopWriter.scala:83)
20:14:09.012 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 144)
20:14:09.012 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 144)
20:14:09.012 INFO DAGScheduler - Submitting ShuffleMapStage 144 (MapPartitionsRDD[670] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:09.029 INFO MemoryStore - Block broadcast_272 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
20:14:09.030 INFO MemoryStore - Block broadcast_272_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
20:14:09.030 INFO BlockManagerInfo - Added broadcast_272_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.2 MiB)
20:14:09.031 INFO SparkContext - Created broadcast 272 from broadcast at DAGScheduler.scala:1580
20:14:09.031 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 144 (MapPartitionsRDD[670] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:09.031 INFO TaskSchedulerImpl - Adding task set 144.0 with 1 tasks resource profile 0
20:14:09.031 INFO TaskSetManager - Starting task 0.0 in stage 144.0 (TID 200) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
20:14:09.032 INFO Executor - Running task 0.0 in stage 144.0 (TID 200)
20:14:09.060 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:14:09.077 INFO Executor - Finished task 0.0 in stage 144.0 (TID 200). 1148 bytes result sent to driver
20:14:09.077 INFO TaskSetManager - Finished task 0.0 in stage 144.0 (TID 200) in 46 ms on localhost (executor driver) (1/1)
20:14:09.077 INFO TaskSchedulerImpl - Removed TaskSet 144.0, whose tasks have all completed, from pool
20:14:09.078 INFO DAGScheduler - ShuffleMapStage 144 (mapToPair at SparkUtils.java:161) finished in 0.066 s
20:14:09.078 INFO DAGScheduler - looking for newly runnable stages
20:14:09.078 INFO DAGScheduler - running: HashSet()
20:14:09.078 INFO DAGScheduler - waiting: HashSet(ResultStage 145)
20:14:09.078 INFO DAGScheduler - failed: HashSet()
20:14:09.078 INFO DAGScheduler - Submitting ResultStage 145 (MapPartitionsRDD[675] at mapToPair at BamSink.java:91), which has no missing parents
20:14:09.085 INFO MemoryStore - Block broadcast_273 stored as values in memory (estimated size 241.4 KiB, free 1915.7 MiB)
20:14:09.085 INFO MemoryStore - Block broadcast_273_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.6 MiB)
20:14:09.086 INFO BlockManagerInfo - Added broadcast_273_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.1 MiB)
20:14:09.086 INFO SparkContext - Created broadcast 273 from broadcast at DAGScheduler.scala:1580
20:14:09.086 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 145 (MapPartitionsRDD[675] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:09.086 INFO TaskSchedulerImpl - Adding task set 145.0 with 1 tasks resource profile 0
20:14:09.086 INFO TaskSetManager - Starting task 0.0 in stage 145.0 (TID 201) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:09.087 INFO Executor - Running task 0.0 in stage 145.0 (TID 201)
20:14:09.091 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:09.091 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:09.102 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:09.102 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:09.102 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.103 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:09.103 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:09.103 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.126 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014082426402724753384804_0675_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest24847160074689645499.bam.parts/_temporary/0/task_202502102014082426402724753384804_0675_r_000000
20:14:09.126 INFO SparkHadoopMapRedUtil - attempt_202502102014082426402724753384804_0675_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:09.127 INFO Executor - Finished task 0.0 in stage 145.0 (TID 201). 1858 bytes result sent to driver
20:14:09.127 INFO TaskSetManager - Finished task 0.0 in stage 145.0 (TID 201) in 41 ms on localhost (executor driver) (1/1)
20:14:09.127 INFO TaskSchedulerImpl - Removed TaskSet 145.0, whose tasks have all completed, from pool
20:14:09.127 INFO DAGScheduler - ResultStage 145 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
20:14:09.127 INFO DAGScheduler - Job 103 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.127 INFO TaskSchedulerImpl - Killing all running tasks in stage 145: Stage finished
20:14:09.127 INFO DAGScheduler - Job 103 finished: runJob at SparkHadoopWriter.scala:83, took 0.116077 s
20:14:09.128 INFO SparkHadoopWriter - Start to commit write Job job_202502102014082426402724753384804_0675.
20:14:09.132 INFO SparkHadoopWriter - Write Job job_202502102014082426402724753384804_0675 committed. Elapsed time: 4 ms.
20:14:09.144 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest24847160074689645499.bam
20:14:09.148 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest24847160074689645499.bam done
20:14:09.148 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest24847160074689645499.bam.parts/ to /tmp/ReadsSparkSinkUnitTest24847160074689645499.bam.sbi
20:14:09.153 INFO IndexFileMerger - Done merging .sbi files
20:14:09.153 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest24847160074689645499.bam.parts/ to /tmp/ReadsSparkSinkUnitTest24847160074689645499.bam.bai
20:14:09.158 INFO IndexFileMerger - Done merging .bai files
20:14:09.160 INFO MemoryStore - Block broadcast_274 stored as values in memory (estimated size 320.0 B, free 1915.6 MiB)
20:14:09.161 INFO MemoryStore - Block broadcast_274_piece0 stored as bytes in memory (estimated size 233.0 B, free 1915.6 MiB)
20:14:09.161 INFO BlockManagerInfo - Added broadcast_274_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.1 MiB)
20:14:09.161 INFO SparkContext - Created broadcast 274 from broadcast at BamSource.java:104
20:14:09.162 INFO MemoryStore - Block broadcast_275 stored as values in memory (estimated size 297.9 KiB, free 1915.3 MiB)
20:14:09.173 INFO MemoryStore - Block broadcast_275_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
20:14:09.173 INFO BlockManagerInfo - Added broadcast_275_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.1 MiB)
20:14:09.173 INFO SparkContext - Created broadcast 275 from newAPIHadoopFile at PathSplitSource.java:96
20:14:09.183 INFO FileInputFormat - Total input files to process : 1
20:14:09.197 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:09.198 INFO DAGScheduler - Got job 104 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:09.198 INFO DAGScheduler - Final stage: ResultStage 146 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:09.198 INFO DAGScheduler - Parents of final stage: List()
20:14:09.198 INFO DAGScheduler - Missing parents: List()
20:14:09.198 INFO DAGScheduler - Submitting ResultStage 146 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:09.204 INFO MemoryStore - Block broadcast_276 stored as values in memory (estimated size 148.2 KiB, free 1915.1 MiB)
20:14:09.204 INFO MemoryStore - Block broadcast_276_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.1 MiB)
20:14:09.205 INFO BlockManagerInfo - Added broadcast_276_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.0 MiB)
20:14:09.205 INFO SparkContext - Created broadcast 276 from broadcast at DAGScheduler.scala:1580
20:14:09.205 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 146 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:09.205 INFO TaskSchedulerImpl - Adding task set 146.0 with 1 tasks resource profile 0
20:14:09.206 INFO TaskSetManager - Starting task 0.0 in stage 146.0 (TID 202) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:09.206 INFO Executor - Running task 0.0 in stage 146.0 (TID 202)
20:14:09.217 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest24847160074689645499.bam:0+235514
20:14:09.224 INFO BlockManagerInfo - Removed broadcast_264_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:14:09.225 INFO BlockManagerInfo - Removed broadcast_266_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.2 MiB)
20:14:09.225 INFO BlockManagerInfo - Removed broadcast_258_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:09.225 INFO Executor - Finished task 0.0 in stage 146.0 (TID 202). 650227 bytes result sent to driver
20:14:09.226 INFO BlockManagerInfo - Removed broadcast_265_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.4 MiB)
20:14:09.226 INFO BlockManagerInfo - Removed broadcast_267_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.6 MiB)
20:14:09.227 INFO BlockManagerInfo - Removed broadcast_270_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.6 MiB)
20:14:09.227 INFO BlockManagerInfo - Removed broadcast_271_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.6 MiB)
20:14:09.227 INFO BlockManagerInfo - Removed broadcast_273_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.6 MiB)
20:14:09.228 INFO TaskSetManager - Finished task 0.0 in stage 146.0 (TID 202) in 23 ms on localhost (executor driver) (1/1)
20:14:09.228 INFO TaskSchedulerImpl - Removed TaskSet 146.0, whose tasks have all completed, from pool
20:14:09.228 INFO DAGScheduler - ResultStage 146 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.030 s
20:14:09.228 INFO BlockManagerInfo - Removed broadcast_269_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.7 MiB)
20:14:09.228 INFO DAGScheduler - Job 104 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.228 INFO TaskSchedulerImpl - Killing all running tasks in stage 146: Stage finished
20:14:09.228 INFO DAGScheduler - Job 104 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.030774 s
20:14:09.229 INFO BlockManagerInfo - Removed broadcast_272_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.8 MiB)
20:14:09.238 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:09.238 INFO DAGScheduler - Got job 105 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:09.238 INFO DAGScheduler - Final stage: ResultStage 147 (count at ReadsSparkSinkUnitTest.java:185)
20:14:09.238 INFO DAGScheduler - Parents of final stage: List()
20:14:09.238 INFO DAGScheduler - Missing parents: List()
20:14:09.238 INFO DAGScheduler - Submitting ResultStage 147 (MapPartitionsRDD[663] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:09.255 INFO MemoryStore - Block broadcast_277 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
20:14:09.256 INFO MemoryStore - Block broadcast_277_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
20:14:09.256 INFO BlockManagerInfo - Added broadcast_277_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.7 MiB)
20:14:09.256 INFO SparkContext - Created broadcast 277 from broadcast at DAGScheduler.scala:1580
20:14:09.256 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 147 (MapPartitionsRDD[663] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:09.256 INFO TaskSchedulerImpl - Adding task set 147.0 with 1 tasks resource profile 0
20:14:09.257 INFO TaskSetManager - Starting task 0.0 in stage 147.0 (TID 203) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
20:14:09.257 INFO Executor - Running task 0.0 in stage 147.0 (TID 203)
20:14:09.285 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:14:09.296 INFO Executor - Finished task 0.0 in stage 147.0 (TID 203). 989 bytes result sent to driver
20:14:09.297 INFO TaskSetManager - Finished task 0.0 in stage 147.0 (TID 203) in 40 ms on localhost (executor driver) (1/1)
20:14:09.297 INFO TaskSchedulerImpl - Removed TaskSet 147.0, whose tasks have all completed, from pool
20:14:09.297 INFO DAGScheduler - ResultStage 147 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.059 s
20:14:09.297 INFO DAGScheduler - Job 105 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.297 INFO TaskSchedulerImpl - Killing all running tasks in stage 147: Stage finished
20:14:09.297 INFO DAGScheduler - Job 105 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059521 s
20:14:09.301 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:09.301 INFO DAGScheduler - Got job 106 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:09.301 INFO DAGScheduler - Final stage: ResultStage 148 (count at ReadsSparkSinkUnitTest.java:185)
20:14:09.301 INFO DAGScheduler - Parents of final stage: List()
20:14:09.301 INFO DAGScheduler - Missing parents: List()
20:14:09.301 INFO DAGScheduler - Submitting ResultStage 148 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:09.307 INFO MemoryStore - Block broadcast_278 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
20:14:09.308 INFO MemoryStore - Block broadcast_278_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.4 MiB)
20:14:09.308 INFO BlockManagerInfo - Added broadcast_278_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.6 MiB)
20:14:09.308 INFO SparkContext - Created broadcast 278 from broadcast at DAGScheduler.scala:1580
20:14:09.308 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 148 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:09.308 INFO TaskSchedulerImpl - Adding task set 148.0 with 1 tasks resource profile 0
20:14:09.309 INFO TaskSetManager - Starting task 0.0 in stage 148.0 (TID 204) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:09.309 INFO Executor - Running task 0.0 in stage 148.0 (TID 204)
20:14:09.319 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest24847160074689645499.bam:0+235514
20:14:09.323 INFO Executor - Finished task 0.0 in stage 148.0 (TID 204). 989 bytes result sent to driver
20:14:09.323 INFO TaskSetManager - Finished task 0.0 in stage 148.0 (TID 204) in 14 ms on localhost (executor driver) (1/1)
20:14:09.323 INFO TaskSchedulerImpl - Removed TaskSet 148.0, whose tasks have all completed, from pool
20:14:09.323 INFO DAGScheduler - ResultStage 148 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
20:14:09.323 INFO DAGScheduler - Job 106 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.323 INFO TaskSchedulerImpl - Killing all running tasks in stage 148: Stage finished
20:14:09.323 INFO DAGScheduler - Job 106 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022454 s
20:14:09.325 INFO MemoryStore - Block broadcast_279 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
20:14:09.331 INFO MemoryStore - Block broadcast_279_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:09.332 INFO BlockManagerInfo - Added broadcast_279_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:09.332 INFO SparkContext - Created broadcast 279 from newAPIHadoopFile at PathSplitSource.java:96
20:14:09.353 INFO MemoryStore - Block broadcast_280 stored as values in memory (estimated size 298.0 KiB, free 1917.7 MiB)
20:14:09.359 INFO MemoryStore - Block broadcast_280_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
20:14:09.359 INFO BlockManagerInfo - Added broadcast_280_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:09.360 INFO SparkContext - Created broadcast 280 from newAPIHadoopFile at PathSplitSource.java:96
20:14:09.379 INFO FileInputFormat - Total input files to process : 1
20:14:09.381 INFO MemoryStore - Block broadcast_281 stored as values in memory (estimated size 19.6 KiB, free 1917.7 MiB)
20:14:09.381 INFO MemoryStore - Block broadcast_281_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.7 MiB)
20:14:09.381 INFO BlockManagerInfo - Added broadcast_281_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.5 MiB)
20:14:09.381 INFO SparkContext - Created broadcast 281 from broadcast at ReadsSparkSink.java:133
20:14:09.382 INFO MemoryStore - Block broadcast_282 stored as values in memory (estimated size 20.0 KiB, free 1917.6 MiB)
20:14:09.382 INFO MemoryStore - Block broadcast_282_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.6 MiB)
20:14:09.382 INFO BlockManagerInfo - Added broadcast_282_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.5 MiB)
20:14:09.383 INFO SparkContext - Created broadcast 282 from broadcast at BamSink.java:76
20:14:09.384 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:09.384 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:09.384 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.401 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:09.402 INFO DAGScheduler - Registering RDD 695 (mapToPair at SparkUtils.java:161) as input to shuffle 31
20:14:09.402 INFO DAGScheduler - Got job 107 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:09.402 INFO DAGScheduler - Final stage: ResultStage 150 (runJob at SparkHadoopWriter.scala:83)
20:14:09.402 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 149)
20:14:09.402 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 149)
20:14:09.402 INFO DAGScheduler - Submitting ShuffleMapStage 149 (MapPartitionsRDD[695] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:09.419 INFO MemoryStore - Block broadcast_283 stored as values in memory (estimated size 434.3 KiB, free 1917.2 MiB)
20:14:09.420 INFO MemoryStore - Block broadcast_283_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1917.1 MiB)
20:14:09.421 INFO BlockManagerInfo - Added broadcast_283_piece0 in memory on localhost:35739 (size: 157.6 KiB, free: 1919.4 MiB)
20:14:09.421 INFO SparkContext - Created broadcast 283 from broadcast at DAGScheduler.scala:1580
20:14:09.421 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 149 (MapPartitionsRDD[695] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:09.421 INFO TaskSchedulerImpl - Adding task set 149.0 with 1 tasks resource profile 0
20:14:09.421 INFO TaskSetManager - Starting task 0.0 in stage 149.0 (TID 205) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
20:14:09.422 INFO Executor - Running task 0.0 in stage 149.0 (TID 205)
20:14:09.450 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:14:09.463 INFO Executor - Finished task 0.0 in stage 149.0 (TID 205). 1148 bytes result sent to driver
20:14:09.463 INFO TaskSetManager - Finished task 0.0 in stage 149.0 (TID 205) in 42 ms on localhost (executor driver) (1/1)
20:14:09.463 INFO TaskSchedulerImpl - Removed TaskSet 149.0, whose tasks have all completed, from pool
20:14:09.463 INFO DAGScheduler - ShuffleMapStage 149 (mapToPair at SparkUtils.java:161) finished in 0.061 s
20:14:09.463 INFO DAGScheduler - looking for newly runnable stages
20:14:09.463 INFO DAGScheduler - running: HashSet()
20:14:09.463 INFO DAGScheduler - waiting: HashSet(ResultStage 150)
20:14:09.463 INFO DAGScheduler - failed: HashSet()
20:14:09.463 INFO DAGScheduler - Submitting ResultStage 150 (MapPartitionsRDD[700] at mapToPair at BamSink.java:91), which has no missing parents
20:14:09.470 INFO MemoryStore - Block broadcast_284 stored as values in memory (estimated size 155.3 KiB, free 1916.9 MiB)
20:14:09.471 INFO MemoryStore - Block broadcast_284_piece0 stored as bytes in memory (estimated size 58.4 KiB, free 1916.8 MiB)
20:14:09.471 INFO BlockManagerInfo - Added broadcast_284_piece0 in memory on localhost:35739 (size: 58.4 KiB, free: 1919.3 MiB)
20:14:09.471 INFO SparkContext - Created broadcast 284 from broadcast at DAGScheduler.scala:1580
20:14:09.471 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 150 (MapPartitionsRDD[700] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:09.471 INFO TaskSchedulerImpl - Adding task set 150.0 with 1 tasks resource profile 0
20:14:09.472 INFO TaskSetManager - Starting task 0.0 in stage 150.0 (TID 206) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:09.472 INFO Executor - Running task 0.0 in stage 150.0 (TID 206)
20:14:09.475 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:09.476 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:09.486 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:09.486 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:09.486 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.486 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:09.486 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:09.486 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.509 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014094687152016360249977_0700_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest35686897756114496513.bam.parts/_temporary/0/task_202502102014094687152016360249977_0700_r_000000
20:14:09.509 INFO SparkHadoopMapRedUtil - attempt_202502102014094687152016360249977_0700_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:09.510 INFO Executor - Finished task 0.0 in stage 150.0 (TID 206). 1858 bytes result sent to driver
20:14:09.510 INFO TaskSetManager - Finished task 0.0 in stage 150.0 (TID 206) in 38 ms on localhost (executor driver) (1/1)
20:14:09.510 INFO TaskSchedulerImpl - Removed TaskSet 150.0, whose tasks have all completed, from pool
20:14:09.510 INFO DAGScheduler - ResultStage 150 (runJob at SparkHadoopWriter.scala:83) finished in 0.046 s
20:14:09.511 INFO DAGScheduler - Job 107 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.511 INFO TaskSchedulerImpl - Killing all running tasks in stage 150: Stage finished
20:14:09.511 INFO DAGScheduler - Job 107 finished: runJob at SparkHadoopWriter.scala:83, took 0.109337 s
20:14:09.511 INFO SparkHadoopWriter - Start to commit write Job job_202502102014094687152016360249977_0700.
20:14:09.515 INFO SparkHadoopWriter - Write Job job_202502102014094687152016360249977_0700 committed. Elapsed time: 4 ms.
20:14:09.526 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest35686897756114496513.bam
20:14:09.530 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest35686897756114496513.bam done
20:14:09.530 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest35686897756114496513.bam.parts/ to /tmp/ReadsSparkSinkUnitTest35686897756114496513.bam.sbi
20:14:09.535 INFO IndexFileMerger - Done merging .sbi files
20:14:09.535 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest35686897756114496513.bam.parts/ to /tmp/ReadsSparkSinkUnitTest35686897756114496513.bam.bai
20:14:09.539 INFO IndexFileMerger - Done merging .bai files
20:14:09.540 INFO MemoryStore - Block broadcast_285 stored as values in memory (estimated size 312.0 B, free 1916.8 MiB)
20:14:09.541 INFO MemoryStore - Block broadcast_285_piece0 stored as bytes in memory (estimated size 231.0 B, free 1916.8 MiB)
20:14:09.541 INFO BlockManagerInfo - Added broadcast_285_piece0 in memory on localhost:35739 (size: 231.0 B, free: 1919.3 MiB)
20:14:09.541 INFO SparkContext - Created broadcast 285 from broadcast at BamSource.java:104
20:14:09.542 INFO MemoryStore - Block broadcast_286 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
20:14:09.548 INFO MemoryStore - Block broadcast_286_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.5 MiB)
20:14:09.548 INFO BlockManagerInfo - Added broadcast_286_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:09.548 INFO SparkContext - Created broadcast 286 from newAPIHadoopFile at PathSplitSource.java:96
20:14:09.557 INFO FileInputFormat - Total input files to process : 1
20:14:09.571 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:09.571 INFO DAGScheduler - Got job 108 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:09.571 INFO DAGScheduler - Final stage: ResultStage 151 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:09.571 INFO DAGScheduler - Parents of final stage: List()
20:14:09.571 INFO DAGScheduler - Missing parents: List()
20:14:09.571 INFO DAGScheduler - Submitting ResultStage 151 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:09.578 INFO MemoryStore - Block broadcast_287 stored as values in memory (estimated size 148.2 KiB, free 1916.4 MiB)
20:14:09.578 INFO MemoryStore - Block broadcast_287_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.3 MiB)
20:14:09.578 INFO BlockManagerInfo - Added broadcast_287_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.2 MiB)
20:14:09.579 INFO SparkContext - Created broadcast 287 from broadcast at DAGScheduler.scala:1580
20:14:09.579 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 151 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:09.579 INFO TaskSchedulerImpl - Adding task set 151.0 with 1 tasks resource profile 0
20:14:09.579 INFO TaskSetManager - Starting task 0.0 in stage 151.0 (TID 207) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:09.579 INFO Executor - Running task 0.0 in stage 151.0 (TID 207)
20:14:09.590 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest35686897756114496513.bam:0+236517
20:14:09.594 INFO Executor - Finished task 0.0 in stage 151.0 (TID 207). 749470 bytes result sent to driver
20:14:09.595 INFO TaskSetManager - Finished task 0.0 in stage 151.0 (TID 207) in 16 ms on localhost (executor driver) (1/1)
20:14:09.595 INFO TaskSchedulerImpl - Removed TaskSet 151.0, whose tasks have all completed, from pool
20:14:09.596 INFO DAGScheduler - ResultStage 151 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.023 s
20:14:09.596 INFO DAGScheduler - Job 108 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.596 INFO TaskSchedulerImpl - Killing all running tasks in stage 151: Stage finished
20:14:09.596 INFO DAGScheduler - Job 108 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.024724 s
20:14:09.607 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:09.607 INFO DAGScheduler - Got job 109 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:09.607 INFO DAGScheduler - Final stage: ResultStage 152 (count at ReadsSparkSinkUnitTest.java:185)
20:14:09.607 INFO DAGScheduler - Parents of final stage: List()
20:14:09.607 INFO DAGScheduler - Missing parents: List()
20:14:09.607 INFO DAGScheduler - Submitting ResultStage 152 (MapPartitionsRDD[688] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:09.624 INFO MemoryStore - Block broadcast_288 stored as values in memory (estimated size 426.1 KiB, free 1915.9 MiB)
20:14:09.625 INFO MemoryStore - Block broadcast_288_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
20:14:09.625 INFO BlockManagerInfo - Added broadcast_288_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.1 MiB)
20:14:09.625 INFO SparkContext - Created broadcast 288 from broadcast at DAGScheduler.scala:1580
20:14:09.625 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 152 (MapPartitionsRDD[688] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:09.625 INFO TaskSchedulerImpl - Adding task set 152.0 with 1 tasks resource profile 0
20:14:09.626 INFO TaskSetManager - Starting task 0.0 in stage 152.0 (TID 208) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
20:14:09.626 INFO Executor - Running task 0.0 in stage 152.0 (TID 208)
20:14:09.658 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:14:09.672 INFO Executor - Finished task 0.0 in stage 152.0 (TID 208). 989 bytes result sent to driver
20:14:09.673 INFO TaskSetManager - Finished task 0.0 in stage 152.0 (TID 208) in 47 ms on localhost (executor driver) (1/1)
20:14:09.673 INFO TaskSchedulerImpl - Removed TaskSet 152.0, whose tasks have all completed, from pool
20:14:09.673 INFO DAGScheduler - ResultStage 152 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.066 s
20:14:09.673 INFO DAGScheduler - Job 109 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.673 INFO TaskSchedulerImpl - Killing all running tasks in stage 152: Stage finished
20:14:09.673 INFO DAGScheduler - Job 109 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066479 s
20:14:09.678 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:09.678 INFO DAGScheduler - Got job 110 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:09.678 INFO DAGScheduler - Final stage: ResultStage 153 (count at ReadsSparkSinkUnitTest.java:185)
20:14:09.678 INFO DAGScheduler - Parents of final stage: List()
20:14:09.678 INFO DAGScheduler - Missing parents: List()
20:14:09.678 INFO DAGScheduler - Submitting ResultStage 153 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:09.685 INFO MemoryStore - Block broadcast_289 stored as values in memory (estimated size 148.1 KiB, free 1915.6 MiB)
20:14:09.685 INFO MemoryStore - Block broadcast_289_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.5 MiB)
20:14:09.685 INFO BlockManagerInfo - Added broadcast_289_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.0 MiB)
20:14:09.686 INFO SparkContext - Created broadcast 289 from broadcast at DAGScheduler.scala:1580
20:14:09.686 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 153 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:09.686 INFO TaskSchedulerImpl - Adding task set 153.0 with 1 tasks resource profile 0
20:14:09.686 INFO TaskSetManager - Starting task 0.0 in stage 153.0 (TID 209) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:09.687 INFO Executor - Running task 0.0 in stage 153.0 (TID 209)
20:14:09.702 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest35686897756114496513.bam:0+236517
20:14:09.711 INFO Executor - Finished task 0.0 in stage 153.0 (TID 209). 1075 bytes result sent to driver
20:14:09.712 INFO TaskSetManager - Finished task 0.0 in stage 153.0 (TID 209) in 25 ms on localhost (executor driver) (1/1)
20:14:09.712 INFO TaskSchedulerImpl - Removed TaskSet 153.0, whose tasks have all completed, from pool
20:14:09.712 INFO BlockManagerInfo - Removed broadcast_277_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.2 MiB)
20:14:09.712 INFO DAGScheduler - ResultStage 153 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.034 s
20:14:09.712 INFO DAGScheduler - Job 110 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.712 INFO TaskSchedulerImpl - Killing all running tasks in stage 153: Stage finished
20:14:09.712 INFO DAGScheduler - Job 110 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.034354 s
20:14:09.713 INFO BlockManagerInfo - Removed broadcast_268_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.2 MiB)
20:14:09.714 INFO BlockManagerInfo - Removed broadcast_280_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:09.715 INFO MemoryStore - Block broadcast_290 stored as values in memory (estimated size 576.0 B, free 1916.8 MiB)
20:14:09.715 INFO BlockManagerInfo - Removed broadcast_287_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.3 MiB)
20:14:09.715 INFO MemoryStore - Block broadcast_290_piece0 stored as bytes in memory (estimated size 228.0 B, free 1916.8 MiB)
20:14:09.715 INFO BlockManagerInfo - Added broadcast_290_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1919.3 MiB)
20:14:09.716 INFO SparkContext - Created broadcast 290 from broadcast at CramSource.java:114
20:14:09.716 INFO BlockManagerInfo - Removed broadcast_275_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:09.716 INFO BlockManagerInfo - Removed broadcast_274_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.4 MiB)
20:14:09.717 INFO MemoryStore - Block broadcast_291 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
20:14:09.717 INFO BlockManagerInfo - Removed broadcast_282_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.4 MiB)
20:14:09.718 INFO BlockManagerInfo - Removed broadcast_288_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.5 MiB)
20:14:09.718 INFO BlockManagerInfo - Removed broadcast_283_piece0 on localhost:35739 in memory (size: 157.6 KiB, free: 1919.7 MiB)
20:14:09.719 INFO BlockManagerInfo - Removed broadcast_278_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.7 MiB)
20:14:09.719 INFO BlockManagerInfo - Removed broadcast_284_piece0 on localhost:35739 in memory (size: 58.4 KiB, free: 1919.8 MiB)
20:14:09.720 INFO BlockManagerInfo - Removed broadcast_281_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.8 MiB)
20:14:09.721 INFO BlockManagerInfo - Removed broadcast_276_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.8 MiB)
20:14:09.726 INFO MemoryStore - Block broadcast_291_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.8 MiB)
20:14:09.726 INFO BlockManagerInfo - Added broadcast_291_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:14:09.726 INFO SparkContext - Created broadcast 291 from newAPIHadoopFile at PathSplitSource.java:96
20:14:09.742 INFO MemoryStore - Block broadcast_292 stored as values in memory (estimated size 576.0 B, free 1918.8 MiB)
20:14:09.742 INFO MemoryStore - Block broadcast_292_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.8 MiB)
20:14:09.742 INFO BlockManagerInfo - Added broadcast_292_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1919.8 MiB)
20:14:09.742 INFO SparkContext - Created broadcast 292 from broadcast at CramSource.java:114
20:14:09.743 INFO MemoryStore - Block broadcast_293 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
20:14:09.749 INFO MemoryStore - Block broadcast_293_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
20:14:09.749 INFO BlockManagerInfo - Added broadcast_293_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:09.749 INFO SparkContext - Created broadcast 293 from newAPIHadoopFile at PathSplitSource.java:96
20:14:09.763 INFO FileInputFormat - Total input files to process : 1
20:14:09.764 INFO MemoryStore - Block broadcast_294 stored as values in memory (estimated size 6.0 KiB, free 1918.4 MiB)
20:14:09.764 INFO MemoryStore - Block broadcast_294_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1918.4 MiB)
20:14:09.764 INFO BlockManagerInfo - Added broadcast_294_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.7 MiB)
20:14:09.764 INFO SparkContext - Created broadcast 294 from broadcast at ReadsSparkSink.java:133
20:14:09.765 INFO MemoryStore - Block broadcast_295 stored as values in memory (estimated size 6.2 KiB, free 1918.4 MiB)
20:14:09.765 INFO MemoryStore - Block broadcast_295_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1918.4 MiB)
20:14:09.765 INFO BlockManagerInfo - Added broadcast_295_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.7 MiB)
20:14:09.766 INFO SparkContext - Created broadcast 295 from broadcast at CramSink.java:76
20:14:09.767 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:09.767 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:09.767 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.784 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:09.784 INFO DAGScheduler - Registering RDD 718 (mapToPair at SparkUtils.java:161) as input to shuffle 32
20:14:09.785 INFO DAGScheduler - Got job 111 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:09.785 INFO DAGScheduler - Final stage: ResultStage 155 (runJob at SparkHadoopWriter.scala:83)
20:14:09.785 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 154)
20:14:09.785 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 154)
20:14:09.785 INFO DAGScheduler - Submitting ShuffleMapStage 154 (MapPartitionsRDD[718] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:09.796 INFO MemoryStore - Block broadcast_296 stored as values in memory (estimated size 292.8 KiB, free 1918.1 MiB)
20:14:09.797 INFO MemoryStore - Block broadcast_296_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1918.0 MiB)
20:14:09.797 INFO BlockManagerInfo - Added broadcast_296_piece0 in memory on localhost:35739 (size: 107.3 KiB, free: 1919.6 MiB)
20:14:09.797 INFO SparkContext - Created broadcast 296 from broadcast at DAGScheduler.scala:1580
20:14:09.798 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 154 (MapPartitionsRDD[718] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:09.798 INFO TaskSchedulerImpl - Adding task set 154.0 with 1 tasks resource profile 0
20:14:09.798 INFO TaskSetManager - Starting task 0.0 in stage 154.0 (TID 210) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
20:14:09.798 INFO Executor - Running task 0.0 in stage 154.0 (TID 210)
20:14:09.819 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:09.835 INFO Executor - Finished task 0.0 in stage 154.0 (TID 210). 1148 bytes result sent to driver
20:14:09.835 INFO TaskSetManager - Finished task 0.0 in stage 154.0 (TID 210) in 37 ms on localhost (executor driver) (1/1)
20:14:09.835 INFO TaskSchedulerImpl - Removed TaskSet 154.0, whose tasks have all completed, from pool
20:14:09.836 INFO DAGScheduler - ShuffleMapStage 154 (mapToPair at SparkUtils.java:161) finished in 0.051 s
20:14:09.836 INFO DAGScheduler - looking for newly runnable stages
20:14:09.836 INFO DAGScheduler - running: HashSet()
20:14:09.836 INFO DAGScheduler - waiting: HashSet(ResultStage 155)
20:14:09.836 INFO DAGScheduler - failed: HashSet()
20:14:09.836 INFO DAGScheduler - Submitting ResultStage 155 (MapPartitionsRDD[723] at mapToPair at CramSink.java:89), which has no missing parents
20:14:09.842 INFO MemoryStore - Block broadcast_297 stored as values in memory (estimated size 153.2 KiB, free 1917.9 MiB)
20:14:09.843 INFO MemoryStore - Block broadcast_297_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1917.8 MiB)
20:14:09.843 INFO BlockManagerInfo - Added broadcast_297_piece0 in memory on localhost:35739 (size: 58.1 KiB, free: 1919.6 MiB)
20:14:09.843 INFO SparkContext - Created broadcast 297 from broadcast at DAGScheduler.scala:1580
20:14:09.843 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 155 (MapPartitionsRDD[723] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
20:14:09.843 INFO TaskSchedulerImpl - Adding task set 155.0 with 1 tasks resource profile 0
20:14:09.844 INFO TaskSetManager - Starting task 0.0 in stage 155.0 (TID 211) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:09.844 INFO Executor - Running task 0.0 in stage 155.0 (TID 211)
20:14:09.850 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:09.850 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:09.860 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:09.860 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:09.860 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.860 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:09.860 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:09.860 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:09.930 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014092132071688007286527_0723_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest517170809721572657893.cram.parts/_temporary/0/task_202502102014092132071688007286527_0723_r_000000
20:14:09.930 INFO SparkHadoopMapRedUtil - attempt_202502102014092132071688007286527_0723_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:09.930 INFO Executor - Finished task 0.0 in stage 155.0 (TID 211). 1858 bytes result sent to driver
20:14:09.931 INFO TaskSetManager - Finished task 0.0 in stage 155.0 (TID 211) in 87 ms on localhost (executor driver) (1/1)
20:14:09.931 INFO TaskSchedulerImpl - Removed TaskSet 155.0, whose tasks have all completed, from pool
20:14:09.931 INFO DAGScheduler - ResultStage 155 (runJob at SparkHadoopWriter.scala:83) finished in 0.095 s
20:14:09.931 INFO DAGScheduler - Job 111 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:09.931 INFO TaskSchedulerImpl - Killing all running tasks in stage 155: Stage finished
20:14:09.931 INFO DAGScheduler - Job 111 finished: runJob at SparkHadoopWriter.scala:83, took 0.146873 s
20:14:09.931 INFO SparkHadoopWriter - Start to commit write Job job_202502102014092132071688007286527_0723.
20:14:09.936 INFO SparkHadoopWriter - Write Job job_202502102014092132071688007286527_0723 committed. Elapsed time: 4 ms.
20:14:09.948 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest517170809721572657893.cram
20:14:09.953 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest517170809721572657893.cram done
20:14:09.954 INFO MemoryStore - Block broadcast_298 stored as values in memory (estimated size 504.0 B, free 1917.8 MiB)
20:14:09.955 INFO MemoryStore - Block broadcast_298_piece0 stored as bytes in memory (estimated size 160.0 B, free 1917.8 MiB)
20:14:09.955 INFO BlockManagerInfo - Added broadcast_298_piece0 in memory on localhost:35739 (size: 160.0 B, free: 1919.6 MiB)
20:14:09.955 INFO SparkContext - Created broadcast 298 from broadcast at CramSource.java:114
20:14:09.956 INFO MemoryStore - Block broadcast_299 stored as values in memory (estimated size 297.9 KiB, free 1917.5 MiB)
20:14:09.962 INFO MemoryStore - Block broadcast_299_piece0 stored as bytes in memory (estimated size 50.1 KiB, free 1917.5 MiB)
20:14:09.962 INFO BlockManagerInfo - Added broadcast_299_piece0 in memory on localhost:35739 (size: 50.1 KiB, free: 1919.5 MiB)
20:14:09.963 INFO SparkContext - Created broadcast 299 from newAPIHadoopFile at PathSplitSource.java:96
20:14:09.977 INFO FileInputFormat - Total input files to process : 1
20:14:10.001 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:10.002 INFO DAGScheduler - Got job 112 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:10.002 INFO DAGScheduler - Final stage: ResultStage 156 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:10.002 INFO DAGScheduler - Parents of final stage: List()
20:14:10.002 INFO DAGScheduler - Missing parents: List()
20:14:10.002 INFO DAGScheduler - Submitting ResultStage 156 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:10.013 INFO MemoryStore - Block broadcast_300 stored as values in memory (estimated size 286.8 KiB, free 1917.2 MiB)
20:14:10.014 INFO MemoryStore - Block broadcast_300_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.1 MiB)
20:14:10.014 INFO BlockManagerInfo - Added broadcast_300_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.4 MiB)
20:14:10.014 INFO SparkContext - Created broadcast 300 from broadcast at DAGScheduler.scala:1580
20:14:10.014 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 156 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:10.014 INFO TaskSchedulerImpl - Adding task set 156.0 with 1 tasks resource profile 0
20:14:10.015 INFO TaskSetManager - Starting task 0.0 in stage 156.0 (TID 212) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7811 bytes)
20:14:10.015 INFO Executor - Running task 0.0 in stage 156.0 (TID 212)
20:14:10.036 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest517170809721572657893.cram:0+43715
20:14:10.058 INFO Executor - Finished task 0.0 in stage 156.0 (TID 212). 154101 bytes result sent to driver
20:14:10.058 INFO TaskSetManager - Finished task 0.0 in stage 156.0 (TID 212) in 43 ms on localhost (executor driver) (1/1)
20:14:10.058 INFO TaskSchedulerImpl - Removed TaskSet 156.0, whose tasks have all completed, from pool
20:14:10.058 INFO DAGScheduler - ResultStage 156 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.056 s
20:14:10.059 INFO DAGScheduler - Job 112 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.059 INFO TaskSchedulerImpl - Killing all running tasks in stage 156: Stage finished
20:14:10.059 INFO DAGScheduler - Job 112 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.057259 s
20:14:10.064 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:10.064 INFO DAGScheduler - Got job 113 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:10.064 INFO DAGScheduler - Final stage: ResultStage 157 (count at ReadsSparkSinkUnitTest.java:185)
20:14:10.064 INFO DAGScheduler - Parents of final stage: List()
20:14:10.064 INFO DAGScheduler - Missing parents: List()
20:14:10.064 INFO DAGScheduler - Submitting ResultStage 157 (MapPartitionsRDD[712] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:10.084 INFO MemoryStore - Block broadcast_301 stored as values in memory (estimated size 286.8 KiB, free 1916.8 MiB)
20:14:10.085 INFO MemoryStore - Block broadcast_301_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1916.7 MiB)
20:14:10.085 INFO BlockManagerInfo - Added broadcast_301_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.3 MiB)
20:14:10.085 INFO SparkContext - Created broadcast 301 from broadcast at DAGScheduler.scala:1580
20:14:10.086 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 157 (MapPartitionsRDD[712] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:10.086 INFO TaskSchedulerImpl - Adding task set 157.0 with 1 tasks resource profile 0
20:14:10.086 INFO TaskSetManager - Starting task 0.0 in stage 157.0 (TID 213) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
20:14:10.086 INFO Executor - Running task 0.0 in stage 157.0 (TID 213)
20:14:10.106 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:10.113 INFO Executor - Finished task 0.0 in stage 157.0 (TID 213). 989 bytes result sent to driver
20:14:10.113 INFO TaskSetManager - Finished task 0.0 in stage 157.0 (TID 213) in 27 ms on localhost (executor driver) (1/1)
20:14:10.114 INFO TaskSchedulerImpl - Removed TaskSet 157.0, whose tasks have all completed, from pool
20:14:10.114 INFO DAGScheduler - ResultStage 157 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.050 s
20:14:10.114 INFO DAGScheduler - Job 113 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.114 INFO TaskSchedulerImpl - Killing all running tasks in stage 157: Stage finished
20:14:10.114 INFO DAGScheduler - Job 113 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.050048 s
20:14:10.118 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:10.119 INFO DAGScheduler - Got job 114 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:10.119 INFO DAGScheduler - Final stage: ResultStage 158 (count at ReadsSparkSinkUnitTest.java:185)
20:14:10.119 INFO DAGScheduler - Parents of final stage: List()
20:14:10.119 INFO DAGScheduler - Missing parents: List()
20:14:10.119 INFO DAGScheduler - Submitting ResultStage 158 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:10.130 INFO MemoryStore - Block broadcast_302 stored as values in memory (estimated size 286.8 KiB, free 1916.4 MiB)
20:14:10.137 INFO MemoryStore - Block broadcast_302_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1916.3 MiB)
20:14:10.137 INFO BlockManagerInfo - Added broadcast_302_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.2 MiB)
20:14:10.137 INFO BlockManagerInfo - Removed broadcast_289_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.3 MiB)
20:14:10.137 INFO SparkContext - Created broadcast 302 from broadcast at DAGScheduler.scala:1580
20:14:10.137 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 158 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:10.137 INFO TaskSchedulerImpl - Adding task set 158.0 with 1 tasks resource profile 0
20:14:10.138 INFO BlockManagerInfo - Removed broadcast_294_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.3 MiB)
20:14:10.138 INFO TaskSetManager - Starting task 0.0 in stage 158.0 (TID 214) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7811 bytes)
20:14:10.138 INFO BlockManagerInfo - Removed broadcast_295_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.3 MiB)
20:14:10.138 INFO Executor - Running task 0.0 in stage 158.0 (TID 214)
20:14:10.140 INFO BlockManagerInfo - Removed broadcast_285_piece0 on localhost:35739 in memory (size: 231.0 B, free: 1919.3 MiB)
20:14:10.140 INFO BlockManagerInfo - Removed broadcast_297_piece0 on localhost:35739 in memory (size: 58.1 KiB, free: 1919.3 MiB)
20:14:10.141 INFO BlockManagerInfo - Removed broadcast_301_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.4 MiB)
20:14:10.141 INFO BlockManagerInfo - Removed broadcast_296_piece0 on localhost:35739 in memory (size: 107.3 KiB, free: 1919.6 MiB)
20:14:10.142 INFO BlockManagerInfo - Removed broadcast_300_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.7 MiB)
20:14:10.142 INFO BlockManagerInfo - Removed broadcast_292_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.7 MiB)
20:14:10.143 INFO BlockManagerInfo - Removed broadcast_279_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:10.143 INFO BlockManagerInfo - Removed broadcast_286_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:10.144 INFO BlockManagerInfo - Removed broadcast_293_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:10.161 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest517170809721572657893.cram:0+43715
20:14:10.172 INFO Executor - Finished task 0.0 in stage 158.0 (TID 214). 989 bytes result sent to driver
20:14:10.172 INFO TaskSetManager - Finished task 0.0 in stage 158.0 (TID 214) in 34 ms on localhost (executor driver) (1/1)
20:14:10.172 INFO TaskSchedulerImpl - Removed TaskSet 158.0, whose tasks have all completed, from pool
20:14:10.173 INFO DAGScheduler - ResultStage 158 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.054 s
20:14:10.173 INFO DAGScheduler - Job 114 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.173 INFO TaskSchedulerImpl - Killing all running tasks in stage 158: Stage finished
20:14:10.173 INFO DAGScheduler - Job 114 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.054349 s
20:14:10.176 INFO MemoryStore - Block broadcast_303 stored as values in memory (estimated size 297.9 KiB, free 1918.6 MiB)
20:14:10.182 INFO MemoryStore - Block broadcast_303_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.6 MiB)
20:14:10.182 INFO BlockManagerInfo - Added broadcast_303_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:14:10.182 INFO SparkContext - Created broadcast 303 from newAPIHadoopFile at PathSplitSource.java:96
20:14:10.204 INFO MemoryStore - Block broadcast_304 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:10.210 INFO MemoryStore - Block broadcast_304_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.3 MiB)
20:14:10.210 INFO BlockManagerInfo - Added broadcast_304_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:10.210 INFO SparkContext - Created broadcast 304 from newAPIHadoopFile at PathSplitSource.java:96
20:14:10.230 INFO FileInputFormat - Total input files to process : 1
20:14:10.232 INFO MemoryStore - Block broadcast_305 stored as values in memory (estimated size 160.7 KiB, free 1918.1 MiB)
20:14:10.233 INFO MemoryStore - Block broadcast_305_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.1 MiB)
20:14:10.233 INFO BlockManagerInfo - Added broadcast_305_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.7 MiB)
20:14:10.233 INFO SparkContext - Created broadcast 305 from broadcast at ReadsSparkSink.java:133
20:14:10.236 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
20:14:10.236 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:10.236 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:10.254 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:10.254 INFO DAGScheduler - Registering RDD 743 (mapToPair at SparkUtils.java:161) as input to shuffle 33
20:14:10.254 INFO DAGScheduler - Got job 115 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:10.254 INFO DAGScheduler - Final stage: ResultStage 160 (runJob at SparkHadoopWriter.scala:83)
20:14:10.254 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 159)
20:14:10.254 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 159)
20:14:10.254 INFO DAGScheduler - Submitting ShuffleMapStage 159 (MapPartitionsRDD[743] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:10.271 INFO MemoryStore - Block broadcast_306 stored as values in memory (estimated size 520.4 KiB, free 1917.6 MiB)
20:14:10.273 INFO MemoryStore - Block broadcast_306_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.4 MiB)
20:14:10.273 INFO BlockManagerInfo - Added broadcast_306_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.5 MiB)
20:14:10.273 INFO SparkContext - Created broadcast 306 from broadcast at DAGScheduler.scala:1580
20:14:10.273 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 159 (MapPartitionsRDD[743] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:10.273 INFO TaskSchedulerImpl - Adding task set 159.0 with 1 tasks resource profile 0
20:14:10.274 INFO TaskSetManager - Starting task 0.0 in stage 159.0 (TID 215) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:10.274 INFO Executor - Running task 0.0 in stage 159.0 (TID 215)
20:14:10.303 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:10.322 INFO Executor - Finished task 0.0 in stage 159.0 (TID 215). 1148 bytes result sent to driver
20:14:10.322 INFO TaskSetManager - Finished task 0.0 in stage 159.0 (TID 215) in 48 ms on localhost (executor driver) (1/1)
20:14:10.322 INFO TaskSchedulerImpl - Removed TaskSet 159.0, whose tasks have all completed, from pool
20:14:10.323 INFO DAGScheduler - ShuffleMapStage 159 (mapToPair at SparkUtils.java:161) finished in 0.068 s
20:14:10.323 INFO DAGScheduler - looking for newly runnable stages
20:14:10.323 INFO DAGScheduler - running: HashSet()
20:14:10.323 INFO DAGScheduler - waiting: HashSet(ResultStage 160)
20:14:10.323 INFO DAGScheduler - failed: HashSet()
20:14:10.323 INFO DAGScheduler - Submitting ResultStage 160 (MapPartitionsRDD[749] at saveAsTextFile at SamSink.java:65), which has no missing parents
20:14:10.334 INFO MemoryStore - Block broadcast_307 stored as values in memory (estimated size 241.1 KiB, free 1917.2 MiB)
20:14:10.335 INFO MemoryStore - Block broadcast_307_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1917.1 MiB)
20:14:10.335 INFO BlockManagerInfo - Added broadcast_307_piece0 in memory on localhost:35739 (size: 66.9 KiB, free: 1919.5 MiB)
20:14:10.335 INFO SparkContext - Created broadcast 307 from broadcast at DAGScheduler.scala:1580
20:14:10.335 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 160 (MapPartitionsRDD[749] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
20:14:10.335 INFO TaskSchedulerImpl - Adding task set 160.0 with 1 tasks resource profile 0
20:14:10.336 INFO TaskSetManager - Starting task 0.0 in stage 160.0 (TID 216) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:10.336 INFO Executor - Running task 0.0 in stage 160.0 (TID 216)
20:14:10.340 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:10.340 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:10.351 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
20:14:10.351 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:10.351 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:10.367 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014101028499215075431434_0749_m_000000_0' to file:/tmp/ReadsSparkSinkUnitTest6522970853234000696.sam.parts/_temporary/0/task_202502102014101028499215075431434_0749_m_000000
20:14:10.367 INFO SparkHadoopMapRedUtil - attempt_202502102014101028499215075431434_0749_m_000000_0: Committed. Elapsed time: 0 ms.
20:14:10.368 INFO Executor - Finished task 0.0 in stage 160.0 (TID 216). 1858 bytes result sent to driver
20:14:10.368 INFO TaskSetManager - Finished task 0.0 in stage 160.0 (TID 216) in 32 ms on localhost (executor driver) (1/1)
20:14:10.368 INFO TaskSchedulerImpl - Removed TaskSet 160.0, whose tasks have all completed, from pool
20:14:10.368 INFO DAGScheduler - ResultStage 160 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
20:14:10.369 INFO DAGScheduler - Job 115 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.369 INFO TaskSchedulerImpl - Killing all running tasks in stage 160: Stage finished
20:14:10.369 INFO DAGScheduler - Job 115 finished: runJob at SparkHadoopWriter.scala:83, took 0.115110 s
20:14:10.369 INFO SparkHadoopWriter - Start to commit write Job job_202502102014101028499215075431434_0749.
20:14:10.373 INFO SparkHadoopWriter - Write Job job_202502102014101028499215075431434_0749 committed. Elapsed time: 4 ms.
20:14:10.381 INFO HadoopFileSystemWrapper - Concatenating 2 parts to /tmp/ReadsSparkSinkUnitTest6522970853234000696.sam
20:14:10.386 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest6522970853234000696.sam done
WARNING 2025-02-10 20:14:10 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-02-10 20:14:10 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:10.389 INFO MemoryStore - Block broadcast_308 stored as values in memory (estimated size 160.7 KiB, free 1917.0 MiB)
20:14:10.390 INFO MemoryStore - Block broadcast_308_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.0 MiB)
20:14:10.390 INFO BlockManagerInfo - Added broadcast_308_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:14:10.390 INFO SparkContext - Created broadcast 308 from broadcast at SamSource.java:78
20:14:10.391 INFO MemoryStore - Block broadcast_309 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
20:14:10.397 INFO MemoryStore - Block broadcast_309_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
20:14:10.397 INFO BlockManagerInfo - Added broadcast_309_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:14:10.398 INFO SparkContext - Created broadcast 309 from newAPIHadoopFile at SamSource.java:108
20:14:10.400 INFO FileInputFormat - Total input files to process : 1
20:14:10.403 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:10.403 INFO DAGScheduler - Got job 116 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:10.403 INFO DAGScheduler - Final stage: ResultStage 161 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:10.403 INFO DAGScheduler - Parents of final stage: List()
20:14:10.404 INFO DAGScheduler - Missing parents: List()
20:14:10.404 INFO DAGScheduler - Submitting ResultStage 161 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:10.404 INFO MemoryStore - Block broadcast_310 stored as values in memory (estimated size 7.5 KiB, free 1916.6 MiB)
20:14:10.404 INFO MemoryStore - Block broadcast_310_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1916.6 MiB)
20:14:10.405 INFO BlockManagerInfo - Added broadcast_310_piece0 in memory on localhost:35739 (size: 3.8 KiB, free: 1919.4 MiB)
20:14:10.405 INFO SparkContext - Created broadcast 310 from broadcast at DAGScheduler.scala:1580
20:14:10.405 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 161 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:10.405 INFO TaskSchedulerImpl - Adding task set 161.0 with 1 tasks resource profile 0
20:14:10.405 INFO TaskSetManager - Starting task 0.0 in stage 161.0 (TID 217) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7808 bytes)
20:14:10.406 INFO Executor - Running task 0.0 in stage 161.0 (TID 217)
20:14:10.407 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest6522970853234000696.sam:0+847558
20:14:10.418 INFO Executor - Finished task 0.0 in stage 161.0 (TID 217). 651483 bytes result sent to driver
20:14:10.420 INFO TaskSetManager - Finished task 0.0 in stage 161.0 (TID 217) in 15 ms on localhost (executor driver) (1/1)
20:14:10.420 INFO TaskSchedulerImpl - Removed TaskSet 161.0, whose tasks have all completed, from pool
20:14:10.420 INFO DAGScheduler - ResultStage 161 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.016 s
20:14:10.420 INFO DAGScheduler - Job 116 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.420 INFO TaskSchedulerImpl - Killing all running tasks in stage 161: Stage finished
20:14:10.420 INFO DAGScheduler - Job 116 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.016803 s
20:14:10.435 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:10.435 INFO DAGScheduler - Got job 117 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:10.435 INFO DAGScheduler - Final stage: ResultStage 162 (count at ReadsSparkSinkUnitTest.java:185)
20:14:10.435 INFO DAGScheduler - Parents of final stage: List()
20:14:10.435 INFO DAGScheduler - Missing parents: List()
20:14:10.436 INFO DAGScheduler - Submitting ResultStage 162 (MapPartitionsRDD[736] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:10.452 INFO MemoryStore - Block broadcast_311 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
20:14:10.454 INFO MemoryStore - Block broadcast_311_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.0 MiB)
20:14:10.454 INFO BlockManagerInfo - Added broadcast_311_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.3 MiB)
20:14:10.454 INFO SparkContext - Created broadcast 311 from broadcast at DAGScheduler.scala:1580
20:14:10.454 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 162 (MapPartitionsRDD[736] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:10.455 INFO TaskSchedulerImpl - Adding task set 162.0 with 1 tasks resource profile 0
20:14:10.455 INFO TaskSetManager - Starting task 0.0 in stage 162.0 (TID 218) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:10.456 INFO Executor - Running task 0.0 in stage 162.0 (TID 218)
20:14:10.486 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:10.495 INFO Executor - Finished task 0.0 in stage 162.0 (TID 218). 989 bytes result sent to driver
20:14:10.496 INFO TaskSetManager - Finished task 0.0 in stage 162.0 (TID 218) in 40 ms on localhost (executor driver) (1/1)
20:14:10.496 INFO TaskSchedulerImpl - Removed TaskSet 162.0, whose tasks have all completed, from pool
20:14:10.496 INFO DAGScheduler - ResultStage 162 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
20:14:10.496 INFO DAGScheduler - Job 117 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.496 INFO TaskSchedulerImpl - Killing all running tasks in stage 162: Stage finished
20:14:10.496 INFO DAGScheduler - Job 117 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060827 s
20:14:10.501 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:10.501 INFO DAGScheduler - Got job 118 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:10.501 INFO DAGScheduler - Final stage: ResultStage 163 (count at ReadsSparkSinkUnitTest.java:185)
20:14:10.501 INFO DAGScheduler - Parents of final stage: List()
20:14:10.501 INFO DAGScheduler - Missing parents: List()
20:14:10.501 INFO DAGScheduler - Submitting ResultStage 163 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:10.502 INFO MemoryStore - Block broadcast_312 stored as values in memory (estimated size 7.4 KiB, free 1916.0 MiB)
20:14:10.502 INFO MemoryStore - Block broadcast_312_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1916.0 MiB)
20:14:10.502 INFO BlockManagerInfo - Added broadcast_312_piece0 in memory on localhost:35739 (size: 3.8 KiB, free: 1919.2 MiB)
20:14:10.502 INFO SparkContext - Created broadcast 312 from broadcast at DAGScheduler.scala:1580
20:14:10.502 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 163 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:10.502 INFO TaskSchedulerImpl - Adding task set 163.0 with 1 tasks resource profile 0
20:14:10.503 INFO TaskSetManager - Starting task 0.0 in stage 163.0 (TID 219) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7808 bytes)
20:14:10.503 INFO Executor - Running task 0.0 in stage 163.0 (TID 219)
20:14:10.504 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest6522970853234000696.sam:0+847558
20:14:10.511 INFO Executor - Finished task 0.0 in stage 163.0 (TID 219). 946 bytes result sent to driver
20:14:10.511 INFO TaskSetManager - Finished task 0.0 in stage 163.0 (TID 219) in 8 ms on localhost (executor driver) (1/1)
20:14:10.511 INFO TaskSchedulerImpl - Removed TaskSet 163.0, whose tasks have all completed, from pool
20:14:10.512 INFO DAGScheduler - ResultStage 163 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.011 s
20:14:10.512 INFO DAGScheduler - Job 118 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.512 INFO TaskSchedulerImpl - Killing all running tasks in stage 163: Stage finished
20:14:10.512 INFO DAGScheduler - Job 118 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.011121 s
WARNING 2025-02-10 20:14:10 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-02-10 20:14:10 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:10.515 INFO MemoryStore - Block broadcast_313 stored as values in memory (estimated size 21.0 KiB, free 1916.0 MiB)
20:14:10.515 INFO MemoryStore - Block broadcast_313_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1916.0 MiB)
20:14:10.515 INFO BlockManagerInfo - Added broadcast_313_piece0 in memory on localhost:35739 (size: 2.4 KiB, free: 1919.2 MiB)
20:14:10.516 INFO SparkContext - Created broadcast 313 from broadcast at SamSource.java:78
20:14:10.517 INFO MemoryStore - Block broadcast_314 stored as values in memory (estimated size 298.0 KiB, free 1915.7 MiB)
20:14:10.526 INFO MemoryStore - Block broadcast_314_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1915.7 MiB)
20:14:10.526 INFO BlockManagerInfo - Added broadcast_314_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.2 MiB)
20:14:10.527 INFO SparkContext - Created broadcast 314 from newAPIHadoopFile at SamSource.java:108
20:14:10.532 INFO FileInputFormat - Total input files to process : 1
20:14:10.536 INFO SparkContext - Starting job: collect at SparkUtils.java:205
20:14:10.537 INFO DAGScheduler - Got job 119 (collect at SparkUtils.java:205) with 1 output partitions
20:14:10.537 INFO DAGScheduler - Final stage: ResultStage 164 (collect at SparkUtils.java:205)
20:14:10.537 INFO DAGScheduler - Parents of final stage: List()
20:14:10.537 INFO DAGScheduler - Missing parents: List()
20:14:10.537 INFO DAGScheduler - Submitting ResultStage 164 (MapPartitionsRDD[760] at mapPartitions at SparkUtils.java:188), which has no missing parents
20:14:10.538 INFO MemoryStore - Block broadcast_315 stored as values in memory (estimated size 7.9 KiB, free 1915.7 MiB)
20:14:10.543 INFO MemoryStore - Block broadcast_315_piece0 stored as bytes in memory (estimated size 3.9 KiB, free 1915.7 MiB)
20:14:10.543 INFO BlockManagerInfo - Added broadcast_315_piece0 in memory on localhost:35739 (size: 3.9 KiB, free: 1919.2 MiB)
20:14:10.543 INFO BlockManagerInfo - Removed broadcast_305_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.2 MiB)
20:14:10.543 INFO SparkContext - Created broadcast 315 from broadcast at DAGScheduler.scala:1580
20:14:10.543 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 164 (MapPartitionsRDD[760] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
20:14:10.543 INFO TaskSchedulerImpl - Adding task set 164.0 with 1 tasks resource profile 0
20:14:10.544 INFO BlockManagerInfo - Removed broadcast_290_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.2 MiB)
20:14:10.544 INFO BlockManagerInfo - Removed broadcast_302_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.3 MiB)
20:14:10.544 INFO TaskSetManager - Starting task 0.0 in stage 164.0 (TID 220) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7936 bytes)
20:14:10.544 INFO Executor - Running task 0.0 in stage 164.0 (TID 220)
20:14:10.545 INFO BlockManagerInfo - Removed broadcast_291_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:10.545 INFO BlockManagerInfo - Removed broadcast_299_piece0 on localhost:35739 in memory (size: 50.1 KiB, free: 1919.4 MiB)
20:14:10.545 INFO BlockManagerInfo - Removed broadcast_304_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:10.546 INFO BlockManagerInfo - Removed broadcast_311_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.6 MiB)
20:14:10.546 INFO BlockManagerInfo - Removed broadcast_306_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.8 MiB)
20:14:10.546 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
20:14:10.547 INFO BlockManagerInfo - Removed broadcast_307_piece0 on localhost:35739 in memory (size: 66.9 KiB, free: 1919.8 MiB)
20:14:10.547 INFO BlockManagerInfo - Removed broadcast_312_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1919.8 MiB)
20:14:10.548 INFO BlockManagerInfo - Removed broadcast_308_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:14:10.549 INFO BlockManagerInfo - Removed broadcast_303_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:10.549 INFO Executor - Finished task 0.0 in stage 164.0 (TID 220). 1700 bytes result sent to driver
20:14:10.549 INFO BlockManagerInfo - Removed broadcast_309_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:10.550 INFO BlockManagerInfo - Removed broadcast_298_piece0 on localhost:35739 in memory (size: 160.0 B, free: 1919.9 MiB)
20:14:10.550 INFO BlockManagerInfo - Removed broadcast_310_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1919.9 MiB)
20:14:10.550 INFO TaskSetManager - Finished task 0.0 in stage 164.0 (TID 220) in 6 ms on localhost (executor driver) (1/1)
20:14:10.551 INFO TaskSchedulerImpl - Removed TaskSet 164.0, whose tasks have all completed, from pool
20:14:10.551 INFO DAGScheduler - ResultStage 164 (collect at SparkUtils.java:205) finished in 0.014 s
20:14:10.551 INFO DAGScheduler - Job 119 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.551 INFO TaskSchedulerImpl - Killing all running tasks in stage 164: Stage finished
20:14:10.551 INFO DAGScheduler - Job 119 finished: collect at SparkUtils.java:205, took 0.014330 s
WARNING 2025-02-10 20:14:10 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-02-10 20:14:10 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:10.555 INFO MemoryStore - Block broadcast_316 stored as values in memory (estimated size 21.0 KiB, free 1919.6 MiB)
20:14:10.556 INFO MemoryStore - Block broadcast_316_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1919.6 MiB)
20:14:10.556 INFO BlockManagerInfo - Added broadcast_316_piece0 in memory on localhost:35739 (size: 2.4 KiB, free: 1919.9 MiB)
20:14:10.556 INFO SparkContext - Created broadcast 316 from broadcast at SamSource.java:78
20:14:10.557 INFO MemoryStore - Block broadcast_317 stored as values in memory (estimated size 298.0 KiB, free 1919.3 MiB)
20:14:10.563 INFO MemoryStore - Block broadcast_317_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1919.3 MiB)
20:14:10.563 INFO BlockManagerInfo - Added broadcast_317_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.9 MiB)
20:14:10.563 INFO SparkContext - Created broadcast 317 from newAPIHadoopFile at SamSource.java:108
20:14:10.565 INFO MemoryStore - Block broadcast_318 stored as values in memory (estimated size 21.0 KiB, free 1919.2 MiB)
20:14:10.565 INFO MemoryStore - Block broadcast_318_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1919.2 MiB)
20:14:10.566 INFO BlockManagerInfo - Added broadcast_318_piece0 in memory on localhost:35739 (size: 2.4 KiB, free: 1919.9 MiB)
20:14:10.566 INFO SparkContext - Created broadcast 318 from broadcast at ReadsSparkSink.java:133
20:14:10.566 INFO MemoryStore - Block broadcast_319 stored as values in memory (estimated size 21.5 KiB, free 1919.2 MiB)
20:14:10.567 INFO MemoryStore - Block broadcast_319_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1919.2 MiB)
20:14:10.567 INFO BlockManagerInfo - Added broadcast_319_piece0 in memory on localhost:35739 (size: 2.4 KiB, free: 1919.9 MiB)
20:14:10.567 INFO SparkContext - Created broadcast 319 from broadcast at BamSink.java:76
20:14:10.569 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:10.569 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:10.569 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:10.586 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:10.587 INFO DAGScheduler - Got job 120 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:10.587 INFO DAGScheduler - Final stage: ResultStage 165 (runJob at SparkHadoopWriter.scala:83)
20:14:10.587 INFO DAGScheduler - Parents of final stage: List()
20:14:10.587 INFO DAGScheduler - Missing parents: List()
20:14:10.587 INFO DAGScheduler - Submitting ResultStage 165 (MapPartitionsRDD[770] at mapToPair at BamSink.java:91), which has no missing parents
20:14:10.593 INFO MemoryStore - Block broadcast_320 stored as values in memory (estimated size 152.3 KiB, free 1919.1 MiB)
20:14:10.594 INFO MemoryStore - Block broadcast_320_piece0 stored as bytes in memory (estimated size 56.4 KiB, free 1919.0 MiB)
20:14:10.594 INFO BlockManagerInfo - Added broadcast_320_piece0 in memory on localhost:35739 (size: 56.4 KiB, free: 1919.8 MiB)
20:14:10.594 INFO SparkContext - Created broadcast 320 from broadcast at DAGScheduler.scala:1580
20:14:10.594 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 165 (MapPartitionsRDD[770] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:10.594 INFO TaskSchedulerImpl - Adding task set 165.0 with 1 tasks resource profile 0
20:14:10.595 INFO TaskSetManager - Starting task 0.0 in stage 165.0 (TID 221) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8561 bytes)
20:14:10.595 INFO Executor - Running task 0.0 in stage 165.0 (TID 221)
20:14:10.599 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
20:14:10.602 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:10.602 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:10.602 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:10.602 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:10.602 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:10.602 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:10.630 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014107911318140404118226_0770_r_000000_0' to file:/tmp/ReadsSparkSinkNotSorting8921230591462751094.bam.parts/_temporary/0/task_202502102014107911318140404118226_0770_r_000000
20:14:10.630 INFO SparkHadoopMapRedUtil - attempt_202502102014107911318140404118226_0770_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:10.630 INFO Executor - Finished task 0.0 in stage 165.0 (TID 221). 1084 bytes result sent to driver
20:14:10.631 INFO TaskSetManager - Finished task 0.0 in stage 165.0 (TID 221) in 36 ms on localhost (executor driver) (1/1)
20:14:10.631 INFO TaskSchedulerImpl - Removed TaskSet 165.0, whose tasks have all completed, from pool
20:14:10.631 INFO DAGScheduler - ResultStage 165 (runJob at SparkHadoopWriter.scala:83) finished in 0.044 s
20:14:10.631 INFO DAGScheduler - Job 120 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.631 INFO TaskSchedulerImpl - Killing all running tasks in stage 165: Stage finished
20:14:10.631 INFO DAGScheduler - Job 120 finished: runJob at SparkHadoopWriter.scala:83, took 0.044538 s
20:14:10.631 INFO SparkHadoopWriter - Start to commit write Job job_202502102014107911318140404118226_0770.
20:14:10.636 INFO SparkHadoopWriter - Write Job job_202502102014107911318140404118226_0770 committed. Elapsed time: 4 ms.
20:14:10.649 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkNotSorting8921230591462751094.bam
20:14:10.653 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkNotSorting8921230591462751094.bam done
20:14:10.653 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkNotSorting8921230591462751094.bam.parts/ to /tmp/ReadsSparkSinkNotSorting8921230591462751094.bam.sbi
20:14:10.658 INFO IndexFileMerger - Done merging .sbi files
20:14:10.659 INFO MemoryStore - Block broadcast_321 stored as values in memory (estimated size 192.0 B, free 1919.0 MiB)
20:14:10.659 INFO MemoryStore - Block broadcast_321_piece0 stored as bytes in memory (estimated size 127.0 B, free 1919.0 MiB)
20:14:10.659 INFO BlockManagerInfo - Added broadcast_321_piece0 in memory on localhost:35739 (size: 127.0 B, free: 1919.8 MiB)
20:14:10.659 INFO SparkContext - Created broadcast 321 from broadcast at BamSource.java:104
20:14:10.661 INFO MemoryStore - Block broadcast_322 stored as values in memory (estimated size 297.9 KiB, free 1918.7 MiB)
20:14:10.671 INFO MemoryStore - Block broadcast_322_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
20:14:10.671 INFO BlockManagerInfo - Added broadcast_322_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:14:10.672 INFO SparkContext - Created broadcast 322 from newAPIHadoopFile at PathSplitSource.java:96
20:14:10.686 INFO FileInputFormat - Total input files to process : 1
20:14:10.711 INFO SparkContext - Starting job: collect at SparkUtils.java:205
20:14:10.711 INFO DAGScheduler - Got job 121 (collect at SparkUtils.java:205) with 1 output partitions
20:14:10.711 INFO DAGScheduler - Final stage: ResultStage 166 (collect at SparkUtils.java:205)
20:14:10.711 INFO DAGScheduler - Parents of final stage: List()
20:14:10.711 INFO DAGScheduler - Missing parents: List()
20:14:10.711 INFO DAGScheduler - Submitting ResultStage 166 (MapPartitionsRDD[777] at mapPartitions at SparkUtils.java:188), which has no missing parents
20:14:10.718 INFO MemoryStore - Block broadcast_323 stored as values in memory (estimated size 148.6 KiB, free 1918.5 MiB)
20:14:10.718 INFO MemoryStore - Block broadcast_323_piece0 stored as bytes in memory (estimated size 54.7 KiB, free 1918.5 MiB)
20:14:10.719 INFO BlockManagerInfo - Added broadcast_323_piece0 in memory on localhost:35739 (size: 54.7 KiB, free: 1919.7 MiB)
20:14:10.719 INFO SparkContext - Created broadcast 323 from broadcast at DAGScheduler.scala:1580
20:14:10.719 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 166 (MapPartitionsRDD[777] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
20:14:10.719 INFO TaskSchedulerImpl - Adding task set 166.0 with 1 tasks resource profile 0
20:14:10.719 INFO TaskSetManager - Starting task 0.0 in stage 166.0 (TID 222) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
20:14:10.720 INFO Executor - Running task 0.0 in stage 166.0 (TID 222)
20:14:10.731 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkNotSorting8921230591462751094.bam:0+59395
20:14:10.732 INFO Executor - Finished task 0.0 in stage 166.0 (TID 222). 1700 bytes result sent to driver
20:14:10.733 INFO TaskSetManager - Finished task 0.0 in stage 166.0 (TID 222) in 14 ms on localhost (executor driver) (1/1)
20:14:10.733 INFO TaskSchedulerImpl - Removed TaskSet 166.0, whose tasks have all completed, from pool
20:14:10.733 INFO DAGScheduler - ResultStage 166 (collect at SparkUtils.java:205) finished in 0.021 s
20:14:10.733 INFO DAGScheduler - Job 121 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.733 INFO TaskSchedulerImpl - Killing all running tasks in stage 166: Stage finished
20:14:10.733 INFO DAGScheduler - Job 121 finished: collect at SparkUtils.java:205, took 0.022275 s
20:14:10.749 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:91
20:14:10.750 INFO DAGScheduler - Got job 122 (collect at ReadsSparkSinkUnitTest.java:91) with 1 output partitions
20:14:10.750 INFO DAGScheduler - Final stage: ResultStage 167 (collect at ReadsSparkSinkUnitTest.java:91)
20:14:10.750 INFO DAGScheduler - Parents of final stage: List()
20:14:10.750 INFO DAGScheduler - Missing parents: List()
20:14:10.750 INFO DAGScheduler - Submitting ResultStage 167 (ZippedPartitionsRDD2[780] at zipPartitions at SparkUtils.java:244), which has no missing parents
20:14:10.756 INFO MemoryStore - Block broadcast_324 stored as values in memory (estimated size 149.8 KiB, free 1918.3 MiB)
20:14:10.757 INFO MemoryStore - Block broadcast_324_piece0 stored as bytes in memory (estimated size 55.2 KiB, free 1918.3 MiB)
20:14:10.757 INFO BlockManagerInfo - Added broadcast_324_piece0 in memory on localhost:35739 (size: 55.2 KiB, free: 1919.7 MiB)
20:14:10.757 INFO SparkContext - Created broadcast 324 from broadcast at DAGScheduler.scala:1580
20:14:10.757 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 167 (ZippedPartitionsRDD2[780] at zipPartitions at SparkUtils.java:244) (first 15 tasks are for partitions Vector(0))
20:14:10.757 INFO TaskSchedulerImpl - Adding task set 167.0 with 1 tasks resource profile 0
20:14:10.758 INFO TaskSetManager - Starting task 0.0 in stage 167.0 (TID 223) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8435 bytes)
20:14:10.758 INFO Executor - Running task 0.0 in stage 167.0 (TID 223)
20:14:10.769 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkNotSorting8921230591462751094.bam:0+59395
20:14:10.771 INFO Executor - Finished task 0.0 in stage 167.0 (TID 223). 192451 bytes result sent to driver
20:14:10.772 INFO TaskSetManager - Finished task 0.0 in stage 167.0 (TID 223) in 15 ms on localhost (executor driver) (1/1)
20:14:10.772 INFO TaskSchedulerImpl - Removed TaskSet 167.0, whose tasks have all completed, from pool
20:14:10.773 INFO DAGScheduler - ResultStage 167 (collect at ReadsSparkSinkUnitTest.java:91) finished in 0.023 s
20:14:10.773 INFO DAGScheduler - Job 122 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.773 INFO TaskSchedulerImpl - Killing all running tasks in stage 167: Stage finished
20:14:10.773 INFO DAGScheduler - Job 122 finished: collect at ReadsSparkSinkUnitTest.java:91, took 0.023347 s
WARNING 2025-02-10 20:14:10 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-02-10 20:14:10 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:10.774 INFO MemoryStore - Block broadcast_325 stored as values in memory (estimated size 21.0 KiB, free 1918.3 MiB)
20:14:10.775 INFO MemoryStore - Block broadcast_325_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1918.3 MiB)
20:14:10.775 INFO BlockManagerInfo - Added broadcast_325_piece0 in memory on localhost:35739 (size: 2.4 KiB, free: 1919.7 MiB)
20:14:10.775 INFO SparkContext - Created broadcast 325 from broadcast at SamSource.java:78
20:14:10.776 INFO MemoryStore - Block broadcast_326 stored as values in memory (estimated size 298.0 KiB, free 1918.0 MiB)
20:14:10.782 INFO MemoryStore - Block broadcast_326_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.9 MiB)
20:14:10.782 INFO BlockManagerInfo - Added broadcast_326_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.6 MiB)
20:14:10.782 INFO SparkContext - Created broadcast 326 from newAPIHadoopFile at SamSource.java:108
20:14:10.785 INFO FileInputFormat - Total input files to process : 1
20:14:10.789 INFO SparkContext - Starting job: collect at SparkUtils.java:205
20:14:10.789 INFO DAGScheduler - Got job 123 (collect at SparkUtils.java:205) with 1 output partitions
20:14:10.789 INFO DAGScheduler - Final stage: ResultStage 168 (collect at SparkUtils.java:205)
20:14:10.789 INFO DAGScheduler - Parents of final stage: List()
20:14:10.789 INFO DAGScheduler - Missing parents: List()
20:14:10.789 INFO DAGScheduler - Submitting ResultStage 168 (MapPartitionsRDD[786] at mapPartitions at SparkUtils.java:188), which has no missing parents
20:14:10.790 INFO MemoryStore - Block broadcast_327 stored as values in memory (estimated size 7.9 KiB, free 1917.9 MiB)
20:14:10.790 INFO MemoryStore - Block broadcast_327_piece0 stored as bytes in memory (estimated size 3.9 KiB, free 1917.9 MiB)
20:14:10.790 INFO BlockManagerInfo - Added broadcast_327_piece0 in memory on localhost:35739 (size: 3.9 KiB, free: 1919.6 MiB)
20:14:10.790 INFO SparkContext - Created broadcast 327 from broadcast at DAGScheduler.scala:1580
20:14:10.791 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 168 (MapPartitionsRDD[786] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
20:14:10.791 INFO TaskSchedulerImpl - Adding task set 168.0 with 1 tasks resource profile 0
20:14:10.791 INFO TaskSetManager - Starting task 0.0 in stage 168.0 (TID 224) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7936 bytes)
20:14:10.791 INFO Executor - Running task 0.0 in stage 168.0 (TID 224)
20:14:10.792 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
20:14:10.794 INFO Executor - Finished task 0.0 in stage 168.0 (TID 224). 1657 bytes result sent to driver
20:14:10.795 INFO TaskSetManager - Finished task 0.0 in stage 168.0 (TID 224) in 4 ms on localhost (executor driver) (1/1)
20:14:10.795 INFO TaskSchedulerImpl - Removed TaskSet 168.0, whose tasks have all completed, from pool
20:14:10.795 INFO DAGScheduler - ResultStage 168 (collect at SparkUtils.java:205) finished in 0.006 s
20:14:10.795 INFO DAGScheduler - Job 123 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.795 INFO TaskSchedulerImpl - Killing all running tasks in stage 168: Stage finished
20:14:10.795 INFO DAGScheduler - Job 123 finished: collect at SparkUtils.java:205, took 0.006075 s
20:14:10.802 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:94
20:14:10.802 INFO DAGScheduler - Got job 124 (collect at ReadsSparkSinkUnitTest.java:94) with 1 output partitions
20:14:10.802 INFO DAGScheduler - Final stage: ResultStage 169 (collect at ReadsSparkSinkUnitTest.java:94)
20:14:10.802 INFO DAGScheduler - Parents of final stage: List()
20:14:10.802 INFO DAGScheduler - Missing parents: List()
20:14:10.802 INFO DAGScheduler - Submitting ResultStage 169 (ZippedPartitionsRDD2[789] at zipPartitions at SparkUtils.java:244), which has no missing parents
20:14:10.803 INFO MemoryStore - Block broadcast_328 stored as values in memory (estimated size 9.6 KiB, free 1917.9 MiB)
20:14:10.808 INFO MemoryStore - Block broadcast_328_piece0 stored as bytes in memory (estimated size 4.4 KiB, free 1917.9 MiB)
20:14:10.808 INFO BlockManagerInfo - Added broadcast_328_piece0 in memory on localhost:35739 (size: 4.4 KiB, free: 1919.6 MiB)
20:14:10.809 INFO SparkContext - Created broadcast 328 from broadcast at DAGScheduler.scala:1580
20:14:10.809 INFO BlockManagerInfo - Removed broadcast_317_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.7 MiB)
20:14:10.809 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 169 (ZippedPartitionsRDD2[789] at zipPartitions at SparkUtils.java:244) (first 15 tasks are for partitions Vector(0))
20:14:10.809 INFO TaskSchedulerImpl - Adding task set 169.0 with 1 tasks resource profile 0
20:14:10.809 INFO BlockManagerInfo - Removed broadcast_327_piece0 on localhost:35739 in memory (size: 3.9 KiB, free: 1919.7 MiB)
20:14:10.810 INFO TaskSetManager - Starting task 0.0 in stage 169.0 (TID 225) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8561 bytes)
20:14:10.810 INFO BlockManagerInfo - Removed broadcast_320_piece0 on localhost:35739 in memory (size: 56.4 KiB, free: 1919.7 MiB)
20:14:10.810 INFO Executor - Running task 0.0 in stage 169.0 (TID 225)
20:14:10.811 INFO BlockManagerInfo - Removed broadcast_324_piece0 on localhost:35739 in memory (size: 55.2 KiB, free: 1919.8 MiB)
20:14:10.811 INFO BlockManagerInfo - Removed broadcast_323_piece0 on localhost:35739 in memory (size: 54.7 KiB, free: 1919.8 MiB)
20:14:10.812 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
20:14:10.812 INFO BlockManagerInfo - Removed broadcast_319_piece0 on localhost:35739 in memory (size: 2.4 KiB, free: 1919.8 MiB)
20:14:10.812 INFO BlockManagerInfo - Removed broadcast_315_piece0 on localhost:35739 in memory (size: 3.9 KiB, free: 1919.8 MiB)
20:14:10.813 INFO BlockManagerInfo - Removed broadcast_318_piece0 on localhost:35739 in memory (size: 2.4 KiB, free: 1919.8 MiB)
20:14:10.814 INFO BlockManagerInfo - Removed broadcast_316_piece0 on localhost:35739 in memory (size: 2.4 KiB, free: 1919.8 MiB)
20:14:10.821 INFO Executor - Finished task 0.0 in stage 169.0 (TID 225). 192494 bytes result sent to driver
20:14:10.822 INFO TaskSetManager - Finished task 0.0 in stage 169.0 (TID 225) in 13 ms on localhost (executor driver) (1/1)
20:14:10.822 INFO TaskSchedulerImpl - Removed TaskSet 169.0, whose tasks have all completed, from pool
20:14:10.822 INFO DAGScheduler - ResultStage 169 (collect at ReadsSparkSinkUnitTest.java:94) finished in 0.020 s
20:14:10.822 INFO DAGScheduler - Job 124 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:10.822 INFO TaskSchedulerImpl - Killing all running tasks in stage 169: Stage finished
20:14:10.822 INFO DAGScheduler - Job 124 finished: collect at ReadsSparkSinkUnitTest.java:94, took 0.020028 s
20:14:10.831 INFO MemoryStore - Block broadcast_329 stored as values in memory (estimated size 297.9 KiB, free 1918.6 MiB)
20:14:10.837 INFO MemoryStore - Block broadcast_329_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.6 MiB)
20:14:10.837 INFO BlockManagerInfo - Added broadcast_329_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:14:10.837 INFO SparkContext - Created broadcast 329 from newAPIHadoopFile at PathSplitSource.java:96
20:14:10.859 INFO MemoryStore - Block broadcast_330 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:10.865 INFO MemoryStore - Block broadcast_330_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
20:14:10.865 INFO BlockManagerInfo - Added broadcast_330_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:10.865 INFO SparkContext - Created broadcast 330 from newAPIHadoopFile at PathSplitSource.java:96
20:14:10.885 INFO FileInputFormat - Total input files to process : 1
20:14:10.887 INFO MemoryStore - Block broadcast_331 stored as values in memory (estimated size 160.7 KiB, free 1918.1 MiB)
20:14:10.887 INFO MemoryStore - Block broadcast_331_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.1 MiB)
20:14:10.888 INFO BlockManagerInfo - Added broadcast_331_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.7 MiB)
20:14:10.888 INFO SparkContext - Created broadcast 331 from broadcast at ReadsSparkSink.java:133
20:14:10.889 INFO MemoryStore - Block broadcast_332 stored as values in memory (estimated size 163.2 KiB, free 1917.9 MiB)
20:14:10.890 INFO MemoryStore - Block broadcast_332_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
20:14:10.890 INFO BlockManagerInfo - Added broadcast_332_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.7 MiB)
20:14:10.890 INFO SparkContext - Created broadcast 332 from broadcast at BamSink.java:76
20:14:10.892 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:10.892 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:10.892 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:10.911 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:10.912 INFO DAGScheduler - Registering RDD 803 (mapToPair at SparkUtils.java:161) as input to shuffle 34
20:14:10.912 INFO DAGScheduler - Got job 125 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:10.912 INFO DAGScheduler - Final stage: ResultStage 171 (runJob at SparkHadoopWriter.scala:83)
20:14:10.912 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 170)
20:14:10.912 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 170)
20:14:10.912 INFO DAGScheduler - Submitting ShuffleMapStage 170 (MapPartitionsRDD[803] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:10.935 INFO MemoryStore - Block broadcast_333 stored as values in memory (estimated size 520.4 KiB, free 1917.4 MiB)
20:14:10.937 INFO MemoryStore - Block broadcast_333_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.2 MiB)
20:14:10.937 INFO BlockManagerInfo - Added broadcast_333_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.6 MiB)
20:14:10.937 INFO SparkContext - Created broadcast 333 from broadcast at DAGScheduler.scala:1580
20:14:10.937 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 170 (MapPartitionsRDD[803] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:10.937 INFO TaskSchedulerImpl - Adding task set 170.0 with 1 tasks resource profile 0
20:14:10.938 INFO TaskSetManager - Starting task 0.0 in stage 170.0 (TID 226) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:10.938 INFO Executor - Running task 0.0 in stage 170.0 (TID 226)
20:14:10.968 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:10.984 INFO Executor - Finished task 0.0 in stage 170.0 (TID 226). 1148 bytes result sent to driver
20:14:10.984 INFO TaskSetManager - Finished task 0.0 in stage 170.0 (TID 226) in 46 ms on localhost (executor driver) (1/1)
20:14:10.984 INFO TaskSchedulerImpl - Removed TaskSet 170.0, whose tasks have all completed, from pool
20:14:10.984 INFO DAGScheduler - ShuffleMapStage 170 (mapToPair at SparkUtils.java:161) finished in 0.072 s
20:14:10.984 INFO DAGScheduler - looking for newly runnable stages
20:14:10.984 INFO DAGScheduler - running: HashSet()
20:14:10.984 INFO DAGScheduler - waiting: HashSet(ResultStage 171)
20:14:10.984 INFO DAGScheduler - failed: HashSet()
20:14:10.985 INFO DAGScheduler - Submitting ResultStage 171 (MapPartitionsRDD[808] at mapToPair at BamSink.java:91), which has no missing parents
20:14:10.991 INFO MemoryStore - Block broadcast_334 stored as values in memory (estimated size 241.4 KiB, free 1917.0 MiB)
20:14:10.992 INFO MemoryStore - Block broadcast_334_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.9 MiB)
20:14:10.992 INFO BlockManagerInfo - Added broadcast_334_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.5 MiB)
20:14:10.992 INFO SparkContext - Created broadcast 334 from broadcast at DAGScheduler.scala:1580
20:14:10.992 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 171 (MapPartitionsRDD[808] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:10.992 INFO TaskSchedulerImpl - Adding task set 171.0 with 1 tasks resource profile 0
20:14:10.993 INFO TaskSetManager - Starting task 0.0 in stage 171.0 (TID 227) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:10.993 INFO Executor - Running task 0.0 in stage 171.0 (TID 227)
20:14:10.997 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:10.997 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:11.008 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:11.008 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:11.008 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:11.008 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:11.008 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:11.008 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:11.033 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014104439992286219049889_0808_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace410529619220244390/_temporary/0/task_202502102014104439992286219049889_0808_r_000000
20:14:11.033 INFO SparkHadoopMapRedUtil - attempt_202502102014104439992286219049889_0808_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:11.033 INFO Executor - Finished task 0.0 in stage 171.0 (TID 227). 1858 bytes result sent to driver
20:14:11.034 INFO TaskSetManager - Finished task 0.0 in stage 171.0 (TID 227) in 41 ms on localhost (executor driver) (1/1)
20:14:11.034 INFO TaskSchedulerImpl - Removed TaskSet 171.0, whose tasks have all completed, from pool
20:14:11.034 INFO DAGScheduler - ResultStage 171 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
20:14:11.034 INFO DAGScheduler - Job 125 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.034 INFO TaskSchedulerImpl - Killing all running tasks in stage 171: Stage finished
20:14:11.034 INFO DAGScheduler - Job 125 finished: runJob at SparkHadoopWriter.scala:83, took 0.122606 s
20:14:11.034 INFO SparkHadoopWriter - Start to commit write Job job_202502102014104439992286219049889_0808.
20:14:11.039 INFO SparkHadoopWriter - Write Job job_202502102014104439992286219049889_0808 committed. Elapsed time: 4 ms.
20:14:11.051 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest13287620524284914351.bam
20:14:11.055 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest13287620524284914351.bam done
20:14:11.055 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace410529619220244390 to /tmp/ReadsSparkSinkUnitTest13287620524284914351.bam.sbi
20:14:11.059 INFO IndexFileMerger - Done merging .sbi files
20:14:11.059 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace410529619220244390 to /tmp/ReadsSparkSinkUnitTest13287620524284914351.bam.bai
20:14:11.064 INFO IndexFileMerger - Done merging .bai files
20:14:11.066 INFO MemoryStore - Block broadcast_335 stored as values in memory (estimated size 320.0 B, free 1916.9 MiB)
20:14:11.066 INFO MemoryStore - Block broadcast_335_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.9 MiB)
20:14:11.067 INFO BlockManagerInfo - Added broadcast_335_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.5 MiB)
20:14:11.067 INFO SparkContext - Created broadcast 335 from broadcast at BamSource.java:104
20:14:11.068 INFO MemoryStore - Block broadcast_336 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
20:14:11.074 INFO MemoryStore - Block broadcast_336_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
20:14:11.074 INFO BlockManagerInfo - Added broadcast_336_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:11.074 INFO SparkContext - Created broadcast 336 from newAPIHadoopFile at PathSplitSource.java:96
20:14:11.083 INFO FileInputFormat - Total input files to process : 1
20:14:11.097 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:11.097 INFO DAGScheduler - Got job 126 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:11.097 INFO DAGScheduler - Final stage: ResultStage 172 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:11.097 INFO DAGScheduler - Parents of final stage: List()
20:14:11.097 INFO DAGScheduler - Missing parents: List()
20:14:11.097 INFO DAGScheduler - Submitting ResultStage 172 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:11.103 INFO MemoryStore - Block broadcast_337 stored as values in memory (estimated size 148.2 KiB, free 1916.4 MiB)
20:14:11.104 INFO MemoryStore - Block broadcast_337_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.4 MiB)
20:14:11.104 INFO BlockManagerInfo - Added broadcast_337_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.4 MiB)
20:14:11.104 INFO SparkContext - Created broadcast 337 from broadcast at DAGScheduler.scala:1580
20:14:11.104 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 172 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:11.104 INFO TaskSchedulerImpl - Adding task set 172.0 with 1 tasks resource profile 0
20:14:11.105 INFO TaskSetManager - Starting task 0.0 in stage 172.0 (TID 228) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:11.105 INFO Executor - Running task 0.0 in stage 172.0 (TID 228)
20:14:11.116 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13287620524284914351.bam:0+237038
20:14:11.121 INFO Executor - Finished task 0.0 in stage 172.0 (TID 228). 651483 bytes result sent to driver
20:14:11.122 INFO TaskSetManager - Finished task 0.0 in stage 172.0 (TID 228) in 17 ms on localhost (executor driver) (1/1)
20:14:11.122 INFO TaskSchedulerImpl - Removed TaskSet 172.0, whose tasks have all completed, from pool
20:14:11.122 INFO DAGScheduler - ResultStage 172 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
20:14:11.122 INFO DAGScheduler - Job 126 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.122 INFO TaskSchedulerImpl - Killing all running tasks in stage 172: Stage finished
20:14:11.122 INFO DAGScheduler - Job 126 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025578 s
20:14:11.135 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:11.136 INFO DAGScheduler - Got job 127 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:11.136 INFO DAGScheduler - Final stage: ResultStage 173 (count at ReadsSparkSinkUnitTest.java:185)
20:14:11.136 INFO DAGScheduler - Parents of final stage: List()
20:14:11.136 INFO DAGScheduler - Missing parents: List()
20:14:11.136 INFO DAGScheduler - Submitting ResultStage 173 (MapPartitionsRDD[796] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:11.153 INFO MemoryStore - Block broadcast_338 stored as values in memory (estimated size 426.1 KiB, free 1916.0 MiB)
20:14:11.154 INFO MemoryStore - Block broadcast_338_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.8 MiB)
20:14:11.154 INFO BlockManagerInfo - Added broadcast_338_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.2 MiB)
20:14:11.154 INFO SparkContext - Created broadcast 338 from broadcast at DAGScheduler.scala:1580
20:14:11.155 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 173 (MapPartitionsRDD[796] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:11.155 INFO TaskSchedulerImpl - Adding task set 173.0 with 1 tasks resource profile 0
20:14:11.155 INFO TaskSetManager - Starting task 0.0 in stage 173.0 (TID 229) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:11.155 INFO Executor - Running task 0.0 in stage 173.0 (TID 229)
20:14:11.183 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:11.193 INFO Executor - Finished task 0.0 in stage 173.0 (TID 229). 989 bytes result sent to driver
20:14:11.193 INFO TaskSetManager - Finished task 0.0 in stage 173.0 (TID 229) in 38 ms on localhost (executor driver) (1/1)
20:14:11.193 INFO TaskSchedulerImpl - Removed TaskSet 173.0, whose tasks have all completed, from pool
20:14:11.193 INFO DAGScheduler - ResultStage 173 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:11.193 INFO DAGScheduler - Job 127 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.194 INFO TaskSchedulerImpl - Killing all running tasks in stage 173: Stage finished
20:14:11.194 INFO DAGScheduler - Job 127 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058424 s
20:14:11.197 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:11.197 INFO DAGScheduler - Got job 128 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:11.197 INFO DAGScheduler - Final stage: ResultStage 174 (count at ReadsSparkSinkUnitTest.java:185)
20:14:11.197 INFO DAGScheduler - Parents of final stage: List()
20:14:11.197 INFO DAGScheduler - Missing parents: List()
20:14:11.197 INFO DAGScheduler - Submitting ResultStage 174 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:11.204 INFO MemoryStore - Block broadcast_339 stored as values in memory (estimated size 148.1 KiB, free 1915.7 MiB)
20:14:11.204 INFO MemoryStore - Block broadcast_339_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.6 MiB)
20:14:11.204 INFO BlockManagerInfo - Added broadcast_339_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.2 MiB)
20:14:11.205 INFO SparkContext - Created broadcast 339 from broadcast at DAGScheduler.scala:1580
20:14:11.205 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 174 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:11.205 INFO TaskSchedulerImpl - Adding task set 174.0 with 1 tasks resource profile 0
20:14:11.205 INFO TaskSetManager - Starting task 0.0 in stage 174.0 (TID 230) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:11.205 INFO Executor - Running task 0.0 in stage 174.0 (TID 230)
20:14:11.220 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13287620524284914351.bam:0+237038
20:14:11.224 INFO Executor - Finished task 0.0 in stage 174.0 (TID 230). 989 bytes result sent to driver
20:14:11.224 INFO TaskSetManager - Finished task 0.0 in stage 174.0 (TID 230) in 19 ms on localhost (executor driver) (1/1)
20:14:11.224 INFO TaskSchedulerImpl - Removed TaskSet 174.0, whose tasks have all completed, from pool
20:14:11.224 INFO DAGScheduler - ResultStage 174 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
20:14:11.224 INFO DAGScheduler - Job 128 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.224 INFO TaskSchedulerImpl - Killing all running tasks in stage 174: Stage finished
20:14:11.224 INFO DAGScheduler - Job 128 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027480 s
20:14:11.234 INFO MemoryStore - Block broadcast_340 stored as values in memory (estimated size 297.9 KiB, free 1915.3 MiB)
20:14:11.240 INFO MemoryStore - Block broadcast_340_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
20:14:11.240 INFO BlockManagerInfo - Added broadcast_340_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.1 MiB)
20:14:11.240 INFO SparkContext - Created broadcast 340 from newAPIHadoopFile at PathSplitSource.java:96
20:14:11.261 INFO MemoryStore - Block broadcast_341 stored as values in memory (estimated size 297.9 KiB, free 1915.0 MiB)
20:14:11.267 INFO BlockManagerInfo - Removed broadcast_325_piece0 on localhost:35739 in memory (size: 2.4 KiB, free: 1919.1 MiB)
20:14:11.268 INFO BlockManagerInfo - Removed broadcast_321_piece0 on localhost:35739 in memory (size: 127.0 B, free: 1919.1 MiB)
20:14:11.268 INFO BlockManagerInfo - Removed broadcast_339_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.2 MiB)
20:14:11.268 INFO BlockManagerInfo - Removed broadcast_328_piece0 on localhost:35739 in memory (size: 4.4 KiB, free: 1919.2 MiB)
20:14:11.269 INFO BlockManagerInfo - Removed broadcast_330_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:11.269 INFO BlockManagerInfo - Removed broadcast_334_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.3 MiB)
20:14:11.270 INFO BlockManagerInfo - Removed broadcast_338_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.5 MiB)
20:14:11.270 INFO BlockManagerInfo - Removed broadcast_336_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:11.271 INFO BlockManagerInfo - Removed broadcast_331_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:11.271 INFO BlockManagerInfo - Removed broadcast_332_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:11.271 INFO BlockManagerInfo - Removed broadcast_337_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.6 MiB)
20:14:11.272 INFO BlockManagerInfo - Removed broadcast_329_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:11.272 INFO BlockManagerInfo - Removed broadcast_313_piece0 on localhost:35739 in memory (size: 2.4 KiB, free: 1919.6 MiB)
20:14:11.272 INFO BlockManagerInfo - Removed broadcast_314_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.7 MiB)
20:14:11.273 INFO BlockManagerInfo - Removed broadcast_333_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.9 MiB)
20:14:11.273 INFO BlockManagerInfo - Removed broadcast_335_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.9 MiB)
20:14:11.274 INFO MemoryStore - Block broadcast_341_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.0 MiB)
20:14:11.274 INFO BlockManagerInfo - Removed broadcast_326_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.9 MiB)
20:14:11.274 INFO BlockManagerInfo - Added broadcast_341_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:11.274 INFO SparkContext - Created broadcast 341 from newAPIHadoopFile at PathSplitSource.java:96
20:14:11.274 INFO BlockManagerInfo - Removed broadcast_322_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:11.294 INFO FileInputFormat - Total input files to process : 1
20:14:11.296 INFO MemoryStore - Block broadcast_342 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
20:14:11.297 INFO MemoryStore - Block broadcast_342_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
20:14:11.297 INFO BlockManagerInfo - Added broadcast_342_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:14:11.297 INFO SparkContext - Created broadcast 342 from broadcast at ReadsSparkSink.java:133
20:14:11.298 INFO MemoryStore - Block broadcast_343 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
20:14:11.299 INFO MemoryStore - Block broadcast_343_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
20:14:11.299 INFO BlockManagerInfo - Added broadcast_343_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.9 MiB)
20:14:11.299 INFO SparkContext - Created broadcast 343 from broadcast at BamSink.java:76
20:14:11.301 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:11.301 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:11.301 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:11.318 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:11.318 INFO DAGScheduler - Registering RDD 828 (mapToPair at SparkUtils.java:161) as input to shuffle 35
20:14:11.319 INFO DAGScheduler - Got job 129 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:11.319 INFO DAGScheduler - Final stage: ResultStage 176 (runJob at SparkHadoopWriter.scala:83)
20:14:11.319 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 175)
20:14:11.319 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 175)
20:14:11.319 INFO DAGScheduler - Submitting ShuffleMapStage 175 (MapPartitionsRDD[828] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:11.336 INFO MemoryStore - Block broadcast_344 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
20:14:11.337 INFO MemoryStore - Block broadcast_344_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
20:14:11.337 INFO BlockManagerInfo - Added broadcast_344_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.7 MiB)
20:14:11.337 INFO SparkContext - Created broadcast 344 from broadcast at DAGScheduler.scala:1580
20:14:11.338 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 175 (MapPartitionsRDD[828] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:11.338 INFO TaskSchedulerImpl - Adding task set 175.0 with 1 tasks resource profile 0
20:14:11.338 INFO TaskSetManager - Starting task 0.0 in stage 175.0 (TID 231) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:11.338 INFO Executor - Running task 0.0 in stage 175.0 (TID 231)
20:14:11.374 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:11.389 INFO Executor - Finished task 0.0 in stage 175.0 (TID 231). 1148 bytes result sent to driver
20:14:11.390 INFO TaskSetManager - Finished task 0.0 in stage 175.0 (TID 231) in 52 ms on localhost (executor driver) (1/1)
20:14:11.390 INFO TaskSchedulerImpl - Removed TaskSet 175.0, whose tasks have all completed, from pool
20:14:11.390 INFO DAGScheduler - ShuffleMapStage 175 (mapToPair at SparkUtils.java:161) finished in 0.071 s
20:14:11.390 INFO DAGScheduler - looking for newly runnable stages
20:14:11.390 INFO DAGScheduler - running: HashSet()
20:14:11.390 INFO DAGScheduler - waiting: HashSet(ResultStage 176)
20:14:11.390 INFO DAGScheduler - failed: HashSet()
20:14:11.390 INFO DAGScheduler - Submitting ResultStage 176 (MapPartitionsRDD[833] at mapToPair at BamSink.java:91), which has no missing parents
20:14:11.397 INFO MemoryStore - Block broadcast_345 stored as values in memory (estimated size 241.4 KiB, free 1918.1 MiB)
20:14:11.398 INFO MemoryStore - Block broadcast_345_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.0 MiB)
20:14:11.398 INFO BlockManagerInfo - Added broadcast_345_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.7 MiB)
20:14:11.398 INFO SparkContext - Created broadcast 345 from broadcast at DAGScheduler.scala:1580
20:14:11.398 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 176 (MapPartitionsRDD[833] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:11.398 INFO TaskSchedulerImpl - Adding task set 176.0 with 1 tasks resource profile 0
20:14:11.399 INFO TaskSetManager - Starting task 0.0 in stage 176.0 (TID 232) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:11.399 INFO Executor - Running task 0.0 in stage 176.0 (TID 232)
20:14:11.403 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:11.403 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:11.414 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:11.414 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:11.414 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:11.414 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:11.414 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:11.414 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:11.438 INFO FileOutputCommitter - Saved output of task 'attempt_20250210201411487151714034440733_0833_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace10630735768816226208/_temporary/0/task_20250210201411487151714034440733_0833_r_000000
20:14:11.438 INFO SparkHadoopMapRedUtil - attempt_20250210201411487151714034440733_0833_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:11.439 INFO Executor - Finished task 0.0 in stage 176.0 (TID 232). 1858 bytes result sent to driver
20:14:11.439 INFO TaskSetManager - Finished task 0.0 in stage 176.0 (TID 232) in 41 ms on localhost (executor driver) (1/1)
20:14:11.439 INFO TaskSchedulerImpl - Removed TaskSet 176.0, whose tasks have all completed, from pool
20:14:11.440 INFO DAGScheduler - ResultStage 176 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
20:14:11.440 INFO DAGScheduler - Job 129 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.440 INFO TaskSchedulerImpl - Killing all running tasks in stage 176: Stage finished
20:14:11.440 INFO DAGScheduler - Job 129 finished: runJob at SparkHadoopWriter.scala:83, took 0.121622 s
20:14:11.440 INFO SparkHadoopWriter - Start to commit write Job job_20250210201411487151714034440733_0833.
20:14:11.445 INFO SparkHadoopWriter - Write Job job_20250210201411487151714034440733_0833 committed. Elapsed time: 4 ms.
20:14:11.457 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest15253973540698848114.bam
20:14:11.461 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest15253973540698848114.bam done
20:14:11.461 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace10630735768816226208 to /tmp/ReadsSparkSinkUnitTest15253973540698848114.bam.sbi
20:14:11.466 INFO IndexFileMerger - Done merging .sbi files
20:14:11.466 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace10630735768816226208 to /tmp/ReadsSparkSinkUnitTest15253973540698848114.bam.bai
20:14:11.471 INFO IndexFileMerger - Done merging .bai files
20:14:11.473 INFO MemoryStore - Block broadcast_346 stored as values in memory (estimated size 13.3 KiB, free 1918.0 MiB)
20:14:11.474 INFO MemoryStore - Block broadcast_346_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.0 MiB)
20:14:11.474 INFO BlockManagerInfo - Added broadcast_346_piece0 in memory on localhost:35739 (size: 8.3 KiB, free: 1919.6 MiB)
20:14:11.474 INFO SparkContext - Created broadcast 346 from broadcast at BamSource.java:104
20:14:11.475 INFO MemoryStore - Block broadcast_347 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
20:14:11.481 INFO MemoryStore - Block broadcast_347_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
20:14:11.481 INFO BlockManagerInfo - Added broadcast_347_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:11.481 INFO SparkContext - Created broadcast 347 from newAPIHadoopFile at PathSplitSource.java:96
20:14:11.490 INFO FileInputFormat - Total input files to process : 1
20:14:11.503 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:11.504 INFO DAGScheduler - Got job 130 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:11.504 INFO DAGScheduler - Final stage: ResultStage 177 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:11.504 INFO DAGScheduler - Parents of final stage: List()
20:14:11.504 INFO DAGScheduler - Missing parents: List()
20:14:11.504 INFO DAGScheduler - Submitting ResultStage 177 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:11.510 INFO MemoryStore - Block broadcast_348 stored as values in memory (estimated size 148.2 KiB, free 1917.5 MiB)
20:14:11.510 INFO MemoryStore - Block broadcast_348_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
20:14:11.510 INFO BlockManagerInfo - Added broadcast_348_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.5 MiB)
20:14:11.511 INFO SparkContext - Created broadcast 348 from broadcast at DAGScheduler.scala:1580
20:14:11.511 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 177 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:11.511 INFO TaskSchedulerImpl - Adding task set 177.0 with 1 tasks resource profile 0
20:14:11.511 INFO TaskSetManager - Starting task 0.0 in stage 177.0 (TID 233) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:11.512 INFO Executor - Running task 0.0 in stage 177.0 (TID 233)
20:14:11.522 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest15253973540698848114.bam:0+237038
20:14:11.527 INFO Executor - Finished task 0.0 in stage 177.0 (TID 233). 651483 bytes result sent to driver
20:14:11.528 INFO TaskSetManager - Finished task 0.0 in stage 177.0 (TID 233) in 17 ms on localhost (executor driver) (1/1)
20:14:11.528 INFO TaskSchedulerImpl - Removed TaskSet 177.0, whose tasks have all completed, from pool
20:14:11.528 INFO DAGScheduler - ResultStage 177 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.024 s
20:14:11.528 INFO DAGScheduler - Job 130 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.528 INFO TaskSchedulerImpl - Killing all running tasks in stage 177: Stage finished
20:14:11.528 INFO DAGScheduler - Job 130 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.024894 s
20:14:11.537 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:11.538 INFO DAGScheduler - Got job 131 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:11.538 INFO DAGScheduler - Final stage: ResultStage 178 (count at ReadsSparkSinkUnitTest.java:185)
20:14:11.538 INFO DAGScheduler - Parents of final stage: List()
20:14:11.538 INFO DAGScheduler - Missing parents: List()
20:14:11.538 INFO DAGScheduler - Submitting ResultStage 178 (MapPartitionsRDD[821] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:11.554 INFO MemoryStore - Block broadcast_349 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
20:14:11.555 INFO MemoryStore - Block broadcast_349_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
20:14:11.555 INFO BlockManagerInfo - Added broadcast_349_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.4 MiB)
20:14:11.556 INFO SparkContext - Created broadcast 349 from broadcast at DAGScheduler.scala:1580
20:14:11.556 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 178 (MapPartitionsRDD[821] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:11.556 INFO TaskSchedulerImpl - Adding task set 178.0 with 1 tasks resource profile 0
20:14:11.556 INFO TaskSetManager - Starting task 0.0 in stage 178.0 (TID 234) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:11.557 INFO Executor - Running task 0.0 in stage 178.0 (TID 234)
20:14:11.585 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:11.594 INFO Executor - Finished task 0.0 in stage 178.0 (TID 234). 989 bytes result sent to driver
20:14:11.594 INFO TaskSetManager - Finished task 0.0 in stage 178.0 (TID 234) in 38 ms on localhost (executor driver) (1/1)
20:14:11.595 INFO TaskSchedulerImpl - Removed TaskSet 178.0, whose tasks have all completed, from pool
20:14:11.595 INFO DAGScheduler - ResultStage 178 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:11.595 INFO DAGScheduler - Job 131 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.595 INFO TaskSchedulerImpl - Killing all running tasks in stage 178: Stage finished
20:14:11.595 INFO DAGScheduler - Job 131 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057536 s
20:14:11.598 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:11.598 INFO DAGScheduler - Got job 132 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:11.598 INFO DAGScheduler - Final stage: ResultStage 179 (count at ReadsSparkSinkUnitTest.java:185)
20:14:11.598 INFO DAGScheduler - Parents of final stage: List()
20:14:11.598 INFO DAGScheduler - Missing parents: List()
20:14:11.598 INFO DAGScheduler - Submitting ResultStage 179 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:11.604 INFO MemoryStore - Block broadcast_350 stored as values in memory (estimated size 148.1 KiB, free 1916.7 MiB)
20:14:11.605 INFO MemoryStore - Block broadcast_350_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1916.7 MiB)
20:14:11.605 INFO BlockManagerInfo - Added broadcast_350_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.3 MiB)
20:14:11.605 INFO SparkContext - Created broadcast 350 from broadcast at DAGScheduler.scala:1580
20:14:11.606 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 179 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:11.606 INFO TaskSchedulerImpl - Adding task set 179.0 with 1 tasks resource profile 0
20:14:11.606 INFO TaskSetManager - Starting task 0.0 in stage 179.0 (TID 235) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:11.606 INFO Executor - Running task 0.0 in stage 179.0 (TID 235)
20:14:11.617 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest15253973540698848114.bam:0+237038
20:14:11.620 INFO Executor - Finished task 0.0 in stage 179.0 (TID 235). 989 bytes result sent to driver
20:14:11.621 INFO TaskSetManager - Finished task 0.0 in stage 179.0 (TID 235) in 15 ms on localhost (executor driver) (1/1)
20:14:11.621 INFO TaskSchedulerImpl - Removed TaskSet 179.0, whose tasks have all completed, from pool
20:14:11.621 INFO DAGScheduler - ResultStage 179 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
20:14:11.621 INFO DAGScheduler - Job 132 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.621 INFO TaskSchedulerImpl - Killing all running tasks in stage 179: Stage finished
20:14:11.621 INFO DAGScheduler - Job 132 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022771 s
20:14:11.629 INFO MemoryStore - Block broadcast_351 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
20:14:11.635 INFO MemoryStore - Block broadcast_351_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:14:11.635 INFO BlockManagerInfo - Added broadcast_351_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:11.635 INFO SparkContext - Created broadcast 351 from newAPIHadoopFile at PathSplitSource.java:96
20:14:11.656 INFO MemoryStore - Block broadcast_352 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
20:14:11.662 INFO MemoryStore - Block broadcast_352_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
20:14:11.662 INFO BlockManagerInfo - Added broadcast_352_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:11.662 INFO SparkContext - Created broadcast 352 from newAPIHadoopFile at PathSplitSource.java:96
20:14:11.682 INFO FileInputFormat - Total input files to process : 1
20:14:11.683 INFO MemoryStore - Block broadcast_353 stored as values in memory (estimated size 160.7 KiB, free 1915.9 MiB)
20:14:11.684 INFO MemoryStore - Block broadcast_353_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.8 MiB)
20:14:11.684 INFO BlockManagerInfo - Added broadcast_353_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.2 MiB)
20:14:11.684 INFO SparkContext - Created broadcast 353 from broadcast at ReadsSparkSink.java:133
20:14:11.685 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:11.685 INFO MemoryStore - Block broadcast_354 stored as values in memory (estimated size 163.2 KiB, free 1915.7 MiB)
20:14:11.690 INFO BlockManagerInfo - Removed broadcast_343_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.2 MiB)
20:14:11.690 INFO MemoryStore - Block broadcast_354_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.8 MiB)
20:14:11.691 INFO BlockManagerInfo - Added broadcast_354_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.2 MiB)
20:14:11.691 INFO BlockManagerInfo - Removed broadcast_352_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:11.691 INFO SparkContext - Created broadcast 354 from broadcast at BamSink.java:76
20:14:11.691 INFO BlockManagerInfo - Removed broadcast_347_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:11.692 INFO BlockManagerInfo - Removed broadcast_349_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.5 MiB)
20:14:11.692 INFO BlockManagerInfo - Removed broadcast_341_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:11.692 INFO BlockManagerInfo - Removed broadcast_348_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.6 MiB)
20:14:11.693 INFO BlockManagerInfo - Removed broadcast_342_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.6 MiB)
20:14:11.693 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:11.693 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:11.693 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:11.693 INFO BlockManagerInfo - Removed broadcast_340_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:11.694 INFO BlockManagerInfo - Removed broadcast_350_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.7 MiB)
20:14:11.695 INFO BlockManagerInfo - Removed broadcast_345_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.8 MiB)
20:14:11.695 INFO BlockManagerInfo - Removed broadcast_346_piece0 on localhost:35739 in memory (size: 8.3 KiB, free: 1919.8 MiB)
20:14:11.696 INFO BlockManagerInfo - Removed broadcast_344_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.9 MiB)
20:14:11.711 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:11.712 INFO DAGScheduler - Registering RDD 853 (mapToPair at SparkUtils.java:161) as input to shuffle 36
20:14:11.712 INFO DAGScheduler - Got job 133 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:11.712 INFO DAGScheduler - Final stage: ResultStage 181 (runJob at SparkHadoopWriter.scala:83)
20:14:11.712 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 180)
20:14:11.712 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 180)
20:14:11.712 INFO DAGScheduler - Submitting ShuffleMapStage 180 (MapPartitionsRDD[853] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:11.736 INFO MemoryStore - Block broadcast_355 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
20:14:11.738 INFO MemoryStore - Block broadcast_355_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
20:14:11.738 INFO BlockManagerInfo - Added broadcast_355_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.8 MiB)
20:14:11.738 INFO SparkContext - Created broadcast 355 from broadcast at DAGScheduler.scala:1580
20:14:11.738 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 180 (MapPartitionsRDD[853] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:11.738 INFO TaskSchedulerImpl - Adding task set 180.0 with 1 tasks resource profile 0
20:14:11.739 INFO TaskSetManager - Starting task 0.0 in stage 180.0 (TID 236) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:11.739 INFO Executor - Running task 0.0 in stage 180.0 (TID 236)
20:14:11.768 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:11.788 INFO Executor - Finished task 0.0 in stage 180.0 (TID 236). 1148 bytes result sent to driver
20:14:11.789 INFO TaskSetManager - Finished task 0.0 in stage 180.0 (TID 236) in 51 ms on localhost (executor driver) (1/1)
20:14:11.789 INFO TaskSchedulerImpl - Removed TaskSet 180.0, whose tasks have all completed, from pool
20:14:11.789 INFO DAGScheduler - ShuffleMapStage 180 (mapToPair at SparkUtils.java:161) finished in 0.077 s
20:14:11.789 INFO DAGScheduler - looking for newly runnable stages
20:14:11.789 INFO DAGScheduler - running: HashSet()
20:14:11.789 INFO DAGScheduler - waiting: HashSet(ResultStage 181)
20:14:11.789 INFO DAGScheduler - failed: HashSet()
20:14:11.789 INFO DAGScheduler - Submitting ResultStage 181 (MapPartitionsRDD[858] at mapToPair at BamSink.java:91), which has no missing parents
20:14:11.796 INFO MemoryStore - Block broadcast_356 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
20:14:11.797 INFO MemoryStore - Block broadcast_356_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
20:14:11.797 INFO BlockManagerInfo - Added broadcast_356_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.7 MiB)
20:14:11.797 INFO SparkContext - Created broadcast 356 from broadcast at DAGScheduler.scala:1580
20:14:11.797 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 181 (MapPartitionsRDD[858] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:11.797 INFO TaskSchedulerImpl - Adding task set 181.0 with 1 tasks resource profile 0
20:14:11.798 INFO TaskSetManager - Starting task 0.0 in stage 181.0 (TID 237) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:11.798 INFO Executor - Running task 0.0 in stage 181.0 (TID 237)
20:14:11.802 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:11.802 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:11.814 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:11.814 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:11.814 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:11.815 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:11.815 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:11.815 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:11.834 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014119121619214109931558_0858_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace3134621325458062900/_temporary/0/task_202502102014119121619214109931558_0858_r_000000
20:14:11.834 INFO SparkHadoopMapRedUtil - attempt_202502102014119121619214109931558_0858_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:11.834 INFO Executor - Finished task 0.0 in stage 181.0 (TID 237). 1858 bytes result sent to driver
20:14:11.835 INFO TaskSetManager - Finished task 0.0 in stage 181.0 (TID 237) in 38 ms on localhost (executor driver) (1/1)
20:14:11.835 INFO TaskSchedulerImpl - Removed TaskSet 181.0, whose tasks have all completed, from pool
20:14:11.835 INFO DAGScheduler - ResultStage 181 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
20:14:11.835 INFO DAGScheduler - Job 133 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.835 INFO TaskSchedulerImpl - Killing all running tasks in stage 181: Stage finished
20:14:11.835 INFO DAGScheduler - Job 133 finished: runJob at SparkHadoopWriter.scala:83, took 0.123757 s
20:14:11.835 INFO SparkHadoopWriter - Start to commit write Job job_202502102014119121619214109931558_0858.
20:14:11.840 INFO SparkHadoopWriter - Write Job job_202502102014119121619214109931558_0858 committed. Elapsed time: 4 ms.
20:14:11.851 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest17155257887459871649.bam
20:14:11.855 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest17155257887459871649.bam done
20:14:11.855 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace3134621325458062900 to /tmp/ReadsSparkSinkUnitTest17155257887459871649.bam.bai
20:14:11.860 INFO IndexFileMerger - Done merging .bai files
20:14:11.862 INFO MemoryStore - Block broadcast_357 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:11.868 INFO MemoryStore - Block broadcast_357_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:11.869 INFO BlockManagerInfo - Added broadcast_357_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:11.869 INFO SparkContext - Created broadcast 357 from newAPIHadoopFile at PathSplitSource.java:96
20:14:11.888 INFO FileInputFormat - Total input files to process : 1
20:14:11.923 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:11.923 INFO DAGScheduler - Got job 134 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:11.923 INFO DAGScheduler - Final stage: ResultStage 182 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:11.923 INFO DAGScheduler - Parents of final stage: List()
20:14:11.923 INFO DAGScheduler - Missing parents: List()
20:14:11.924 INFO DAGScheduler - Submitting ResultStage 182 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:11.940 INFO MemoryStore - Block broadcast_358 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
20:14:11.941 INFO MemoryStore - Block broadcast_358_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
20:14:11.941 INFO BlockManagerInfo - Added broadcast_358_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:11.941 INFO SparkContext - Created broadcast 358 from broadcast at DAGScheduler.scala:1580
20:14:11.942 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 182 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:11.942 INFO TaskSchedulerImpl - Adding task set 182.0 with 1 tasks resource profile 0
20:14:11.942 INFO TaskSetManager - Starting task 0.0 in stage 182.0 (TID 238) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:11.942 INFO Executor - Running task 0.0 in stage 182.0 (TID 238)
20:14:11.970 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17155257887459871649.bam:0+237038
20:14:11.982 INFO Executor - Finished task 0.0 in stage 182.0 (TID 238). 651483 bytes result sent to driver
20:14:11.985 INFO TaskSetManager - Finished task 0.0 in stage 182.0 (TID 238) in 43 ms on localhost (executor driver) (1/1)
20:14:11.985 INFO TaskSchedulerImpl - Removed TaskSet 182.0, whose tasks have all completed, from pool
20:14:11.985 INFO DAGScheduler - ResultStage 182 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.061 s
20:14:11.985 INFO DAGScheduler - Job 134 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:11.985 INFO TaskSchedulerImpl - Killing all running tasks in stage 182: Stage finished
20:14:11.985 INFO DAGScheduler - Job 134 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.062056 s
20:14:12.000 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:12.001 INFO DAGScheduler - Got job 135 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:12.001 INFO DAGScheduler - Final stage: ResultStage 183 (count at ReadsSparkSinkUnitTest.java:185)
20:14:12.001 INFO DAGScheduler - Parents of final stage: List()
20:14:12.001 INFO DAGScheduler - Missing parents: List()
20:14:12.001 INFO DAGScheduler - Submitting ResultStage 183 (MapPartitionsRDD[846] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:12.017 INFO MemoryStore - Block broadcast_359 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
20:14:12.019 INFO MemoryStore - Block broadcast_359_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
20:14:12.019 INFO BlockManagerInfo - Added broadcast_359_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.4 MiB)
20:14:12.019 INFO SparkContext - Created broadcast 359 from broadcast at DAGScheduler.scala:1580
20:14:12.019 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 183 (MapPartitionsRDD[846] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:12.019 INFO TaskSchedulerImpl - Adding task set 183.0 with 1 tasks resource profile 0
20:14:12.020 INFO TaskSetManager - Starting task 0.0 in stage 183.0 (TID 239) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:12.020 INFO Executor - Running task 0.0 in stage 183.0 (TID 239)
20:14:12.048 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:12.058 INFO Executor - Finished task 0.0 in stage 183.0 (TID 239). 989 bytes result sent to driver
20:14:12.058 INFO TaskSetManager - Finished task 0.0 in stage 183.0 (TID 239) in 39 ms on localhost (executor driver) (1/1)
20:14:12.058 INFO TaskSchedulerImpl - Removed TaskSet 183.0, whose tasks have all completed, from pool
20:14:12.058 INFO DAGScheduler - ResultStage 183 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:12.058 INFO DAGScheduler - Job 135 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.058 INFO TaskSchedulerImpl - Killing all running tasks in stage 183: Stage finished
20:14:12.058 INFO DAGScheduler - Job 135 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058155 s
20:14:12.062 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:12.062 INFO DAGScheduler - Got job 136 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:12.062 INFO DAGScheduler - Final stage: ResultStage 184 (count at ReadsSparkSinkUnitTest.java:185)
20:14:12.062 INFO DAGScheduler - Parents of final stage: List()
20:14:12.062 INFO DAGScheduler - Missing parents: List()
20:14:12.062 INFO DAGScheduler - Submitting ResultStage 184 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:12.078 INFO MemoryStore - Block broadcast_360 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
20:14:12.080 INFO MemoryStore - Block broadcast_360_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
20:14:12.080 INFO BlockManagerInfo - Added broadcast_360_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.2 MiB)
20:14:12.080 INFO SparkContext - Created broadcast 360 from broadcast at DAGScheduler.scala:1580
20:14:12.080 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 184 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:12.080 INFO TaskSchedulerImpl - Adding task set 184.0 with 1 tasks resource profile 0
20:14:12.081 INFO TaskSetManager - Starting task 0.0 in stage 184.0 (TID 240) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:12.081 INFO Executor - Running task 0.0 in stage 184.0 (TID 240)
20:14:12.109 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17155257887459871649.bam:0+237038
20:14:12.120 INFO Executor - Finished task 0.0 in stage 184.0 (TID 240). 989 bytes result sent to driver
20:14:12.120 INFO TaskSetManager - Finished task 0.0 in stage 184.0 (TID 240) in 40 ms on localhost (executor driver) (1/1)
20:14:12.120 INFO TaskSchedulerImpl - Removed TaskSet 184.0, whose tasks have all completed, from pool
20:14:12.120 INFO DAGScheduler - ResultStage 184 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
20:14:12.120 INFO DAGScheduler - Job 136 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.121 INFO TaskSchedulerImpl - Killing all running tasks in stage 184: Stage finished
20:14:12.121 INFO DAGScheduler - Job 136 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058819 s
20:14:12.129 INFO MemoryStore - Block broadcast_361 stored as values in memory (estimated size 297.9 KiB, free 1916.0 MiB)
20:14:12.135 INFO MemoryStore - Block broadcast_361_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
20:14:12.135 INFO BlockManagerInfo - Added broadcast_361_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:12.135 INFO SparkContext - Created broadcast 361 from newAPIHadoopFile at PathSplitSource.java:96
20:14:12.156 INFO MemoryStore - Block broadcast_362 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
20:14:12.162 INFO MemoryStore - Block broadcast_362_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.6 MiB)
20:14:12.163 INFO BlockManagerInfo - Added broadcast_362_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.1 MiB)
20:14:12.163 INFO SparkContext - Created broadcast 362 from newAPIHadoopFile at PathSplitSource.java:96
20:14:12.182 INFO FileInputFormat - Total input files to process : 1
20:14:12.184 INFO MemoryStore - Block broadcast_363 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
20:14:12.185 INFO MemoryStore - Block broadcast_363_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
20:14:12.185 INFO BlockManagerInfo - Added broadcast_363_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.1 MiB)
20:14:12.185 INFO SparkContext - Created broadcast 363 from broadcast at ReadsSparkSink.java:133
20:14:12.185 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:12.186 INFO MemoryStore - Block broadcast_364 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
20:14:12.187 INFO MemoryStore - Block broadcast_364_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
20:14:12.187 INFO BlockManagerInfo - Added broadcast_364_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.1 MiB)
20:14:12.187 INFO SparkContext - Created broadcast 364 from broadcast at BamSink.java:76
20:14:12.189 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:12.189 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:12.189 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:12.206 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:12.206 INFO DAGScheduler - Registering RDD 879 (mapToPair at SparkUtils.java:161) as input to shuffle 37
20:14:12.206 INFO DAGScheduler - Got job 137 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:12.206 INFO DAGScheduler - Final stage: ResultStage 186 (runJob at SparkHadoopWriter.scala:83)
20:14:12.206 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 185)
20:14:12.206 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 185)
20:14:12.207 INFO DAGScheduler - Submitting ShuffleMapStage 185 (MapPartitionsRDD[879] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:12.223 INFO MemoryStore - Block broadcast_365 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
20:14:12.225 INFO MemoryStore - Block broadcast_365_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
20:14:12.225 INFO BlockManagerInfo - Added broadcast_365_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1918.9 MiB)
20:14:12.225 INFO SparkContext - Created broadcast 365 from broadcast at DAGScheduler.scala:1580
20:14:12.225 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 185 (MapPartitionsRDD[879] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:12.225 INFO TaskSchedulerImpl - Adding task set 185.0 with 1 tasks resource profile 0
20:14:12.226 INFO TaskSetManager - Starting task 0.0 in stage 185.0 (TID 241) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:12.226 INFO Executor - Running task 0.0 in stage 185.0 (TID 241)
20:14:12.255 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:12.270 INFO Executor - Finished task 0.0 in stage 185.0 (TID 241). 1148 bytes result sent to driver
20:14:12.270 INFO TaskSetManager - Finished task 0.0 in stage 185.0 (TID 241) in 44 ms on localhost (executor driver) (1/1)
20:14:12.270 INFO TaskSchedulerImpl - Removed TaskSet 185.0, whose tasks have all completed, from pool
20:14:12.271 INFO DAGScheduler - ShuffleMapStage 185 (mapToPair at SparkUtils.java:161) finished in 0.063 s
20:14:12.271 INFO DAGScheduler - looking for newly runnable stages
20:14:12.271 INFO DAGScheduler - running: HashSet()
20:14:12.271 INFO DAGScheduler - waiting: HashSet(ResultStage 186)
20:14:12.271 INFO DAGScheduler - failed: HashSet()
20:14:12.271 INFO DAGScheduler - Submitting ResultStage 186 (MapPartitionsRDD[884] at mapToPair at BamSink.java:91), which has no missing parents
20:14:12.277 INFO MemoryStore - Block broadcast_366 stored as values in memory (estimated size 241.4 KiB, free 1914.4 MiB)
20:14:12.282 INFO BlockManagerInfo - Removed broadcast_353_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1918.9 MiB)
20:14:12.282 INFO MemoryStore - Block broadcast_366_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1914.5 MiB)
20:14:12.282 INFO BlockManagerInfo - Added broadcast_366_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1918.9 MiB)
20:14:12.282 INFO BlockManagerInfo - Removed broadcast_359_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.0 MiB)
20:14:12.282 INFO SparkContext - Created broadcast 366 from broadcast at DAGScheduler.scala:1580
20:14:12.282 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 186 (MapPartitionsRDD[884] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:12.282 INFO TaskSchedulerImpl - Adding task set 186.0 with 1 tasks resource profile 0
20:14:12.282 INFO BlockManagerInfo - Removed broadcast_360_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.2 MiB)
20:14:12.283 INFO BlockManagerInfo - Removed broadcast_356_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.2 MiB)
20:14:12.283 INFO TaskSetManager - Starting task 0.0 in stage 186.0 (TID 242) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:12.283 INFO Executor - Running task 0.0 in stage 186.0 (TID 242)
20:14:12.283 INFO BlockManagerInfo - Removed broadcast_354_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.2 MiB)
20:14:12.284 INFO BlockManagerInfo - Removed broadcast_362_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:12.285 INFO BlockManagerInfo - Removed broadcast_355_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.5 MiB)
20:14:12.286 INFO BlockManagerInfo - Removed broadcast_357_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:12.286 INFO BlockManagerInfo - Removed broadcast_351_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:12.287 INFO BlockManagerInfo - Removed broadcast_358_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:12.289 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:12.289 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:12.300 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:12.300 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:12.300 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:12.300 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:12.300 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:12.300 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:12.318 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014121330075247007006951_0884_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace9039131044655928289/_temporary/0/task_202502102014121330075247007006951_0884_r_000000
20:14:12.318 INFO SparkHadoopMapRedUtil - attempt_202502102014121330075247007006951_0884_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:12.319 INFO Executor - Finished task 0.0 in stage 186.0 (TID 242). 1858 bytes result sent to driver
20:14:12.319 INFO TaskSetManager - Finished task 0.0 in stage 186.0 (TID 242) in 36 ms on localhost (executor driver) (1/1)
20:14:12.319 INFO TaskSchedulerImpl - Removed TaskSet 186.0, whose tasks have all completed, from pool
20:14:12.319 INFO DAGScheduler - ResultStage 186 (runJob at SparkHadoopWriter.scala:83) finished in 0.048 s
20:14:12.319 INFO DAGScheduler - Job 137 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.319 INFO TaskSchedulerImpl - Killing all running tasks in stage 186: Stage finished
20:14:12.320 INFO DAGScheduler - Job 137 finished: runJob at SparkHadoopWriter.scala:83, took 0.113650 s
20:14:12.320 INFO SparkHadoopWriter - Start to commit write Job job_202502102014121330075247007006951_0884.
20:14:12.324 INFO SparkHadoopWriter - Write Job job_202502102014121330075247007006951_0884 committed. Elapsed time: 4 ms.
20:14:12.336 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest19370255124586567562.bam
20:14:12.340 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest19370255124586567562.bam done
20:14:12.340 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace9039131044655928289 to /tmp/ReadsSparkSinkUnitTest19370255124586567562.bam.sbi
20:14:12.344 INFO IndexFileMerger - Done merging .sbi files
20:14:12.346 INFO MemoryStore - Block broadcast_367 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
20:14:12.346 INFO MemoryStore - Block broadcast_367_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
20:14:12.346 INFO BlockManagerInfo - Added broadcast_367_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.7 MiB)
20:14:12.346 INFO SparkContext - Created broadcast 367 from broadcast at BamSource.java:104
20:14:12.347 INFO MemoryStore - Block broadcast_368 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:12.353 INFO MemoryStore - Block broadcast_368_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:12.353 INFO BlockManagerInfo - Added broadcast_368_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:12.354 INFO SparkContext - Created broadcast 368 from newAPIHadoopFile at PathSplitSource.java:96
20:14:12.362 INFO FileInputFormat - Total input files to process : 1
20:14:12.376 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:12.376 INFO DAGScheduler - Got job 138 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:12.376 INFO DAGScheduler - Final stage: ResultStage 187 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:12.376 INFO DAGScheduler - Parents of final stage: List()
20:14:12.377 INFO DAGScheduler - Missing parents: List()
20:14:12.377 INFO DAGScheduler - Submitting ResultStage 187 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:12.382 INFO MemoryStore - Block broadcast_369 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
20:14:12.383 INFO MemoryStore - Block broadcast_369_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
20:14:12.383 INFO BlockManagerInfo - Added broadcast_369_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:14:12.383 INFO SparkContext - Created broadcast 369 from broadcast at DAGScheduler.scala:1580
20:14:12.383 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 187 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:12.383 INFO TaskSchedulerImpl - Adding task set 187.0 with 1 tasks resource profile 0
20:14:12.384 INFO TaskSetManager - Starting task 0.0 in stage 187.0 (TID 243) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:12.384 INFO Executor - Running task 0.0 in stage 187.0 (TID 243)
20:14:12.395 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19370255124586567562.bam:0+237038
20:14:12.399 INFO Executor - Finished task 0.0 in stage 187.0 (TID 243). 651483 bytes result sent to driver
20:14:12.402 INFO TaskSetManager - Finished task 0.0 in stage 187.0 (TID 243) in 18 ms on localhost (executor driver) (1/1)
20:14:12.402 INFO TaskSchedulerImpl - Removed TaskSet 187.0, whose tasks have all completed, from pool
20:14:12.402 INFO DAGScheduler - ResultStage 187 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.025 s
20:14:12.402 INFO DAGScheduler - Job 138 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.402 INFO TaskSchedulerImpl - Killing all running tasks in stage 187: Stage finished
20:14:12.402 INFO DAGScheduler - Job 138 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026307 s
20:14:12.412 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:12.412 INFO DAGScheduler - Got job 139 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:12.412 INFO DAGScheduler - Final stage: ResultStage 188 (count at ReadsSparkSinkUnitTest.java:185)
20:14:12.412 INFO DAGScheduler - Parents of final stage: List()
20:14:12.412 INFO DAGScheduler - Missing parents: List()
20:14:12.412 INFO DAGScheduler - Submitting ResultStage 188 (MapPartitionsRDD[872] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:12.429 INFO MemoryStore - Block broadcast_370 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
20:14:12.430 INFO MemoryStore - Block broadcast_370_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
20:14:12.430 INFO BlockManagerInfo - Added broadcast_370_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:12.430 INFO SparkContext - Created broadcast 370 from broadcast at DAGScheduler.scala:1580
20:14:12.430 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 188 (MapPartitionsRDD[872] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:12.430 INFO TaskSchedulerImpl - Adding task set 188.0 with 1 tasks resource profile 0
20:14:12.431 INFO TaskSetManager - Starting task 0.0 in stage 188.0 (TID 244) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:12.431 INFO Executor - Running task 0.0 in stage 188.0 (TID 244)
20:14:12.459 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:12.468 INFO Executor - Finished task 0.0 in stage 188.0 (TID 244). 989 bytes result sent to driver
20:14:12.469 INFO TaskSetManager - Finished task 0.0 in stage 188.0 (TID 244) in 38 ms on localhost (executor driver) (1/1)
20:14:12.469 INFO TaskSchedulerImpl - Removed TaskSet 188.0, whose tasks have all completed, from pool
20:14:12.469 INFO DAGScheduler - ResultStage 188 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:12.469 INFO DAGScheduler - Job 139 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.469 INFO TaskSchedulerImpl - Killing all running tasks in stage 188: Stage finished
20:14:12.469 INFO DAGScheduler - Job 139 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057290 s
20:14:12.472 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:12.472 INFO DAGScheduler - Got job 140 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:12.472 INFO DAGScheduler - Final stage: ResultStage 189 (count at ReadsSparkSinkUnitTest.java:185)
20:14:12.472 INFO DAGScheduler - Parents of final stage: List()
20:14:12.472 INFO DAGScheduler - Missing parents: List()
20:14:12.473 INFO DAGScheduler - Submitting ResultStage 189 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:12.483 INFO MemoryStore - Block broadcast_371 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
20:14:12.484 INFO MemoryStore - Block broadcast_371_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
20:14:12.484 INFO BlockManagerInfo - Added broadcast_371_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.4 MiB)
20:14:12.484 INFO SparkContext - Created broadcast 371 from broadcast at DAGScheduler.scala:1580
20:14:12.484 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 189 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:12.484 INFO TaskSchedulerImpl - Adding task set 189.0 with 1 tasks resource profile 0
20:14:12.485 INFO TaskSetManager - Starting task 0.0 in stage 189.0 (TID 245) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:12.485 INFO Executor - Running task 0.0 in stage 189.0 (TID 245)
20:14:12.496 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19370255124586567562.bam:0+237038
20:14:12.499 INFO Executor - Finished task 0.0 in stage 189.0 (TID 245). 989 bytes result sent to driver
20:14:12.500 INFO TaskSetManager - Finished task 0.0 in stage 189.0 (TID 245) in 15 ms on localhost (executor driver) (1/1)
20:14:12.500 INFO TaskSchedulerImpl - Removed TaskSet 189.0, whose tasks have all completed, from pool
20:14:12.500 INFO DAGScheduler - ResultStage 189 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.027 s
20:14:12.500 INFO DAGScheduler - Job 140 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.500 INFO TaskSchedulerImpl - Killing all running tasks in stage 189: Stage finished
20:14:12.500 INFO DAGScheduler - Job 140 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027915 s
20:14:12.509 INFO MemoryStore - Block broadcast_372 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
20:14:12.516 INFO MemoryStore - Block broadcast_372_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
20:14:12.516 INFO BlockManagerInfo - Added broadcast_372_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:12.516 INFO SparkContext - Created broadcast 372 from newAPIHadoopFile at PathSplitSource.java:96
20:14:12.537 INFO MemoryStore - Block broadcast_373 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
20:14:12.543 INFO MemoryStore - Block broadcast_373_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:14:12.544 INFO BlockManagerInfo - Added broadcast_373_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:12.544 INFO SparkContext - Created broadcast 373 from newAPIHadoopFile at PathSplitSource.java:96
20:14:12.563 INFO FileInputFormat - Total input files to process : 1
20:14:12.565 INFO MemoryStore - Block broadcast_374 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
20:14:12.566 INFO MemoryStore - Block broadcast_374_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
20:14:12.566 INFO BlockManagerInfo - Added broadcast_374_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:12.566 INFO SparkContext - Created broadcast 374 from broadcast at ReadsSparkSink.java:133
20:14:12.566 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:12.567 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:12.567 INFO MemoryStore - Block broadcast_375 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
20:14:12.568 INFO MemoryStore - Block broadcast_375_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
20:14:12.568 INFO BlockManagerInfo - Added broadcast_375_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:12.568 INFO SparkContext - Created broadcast 375 from broadcast at BamSink.java:76
20:14:12.570 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:12.570 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:12.570 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:12.587 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:12.588 INFO DAGScheduler - Registering RDD 904 (mapToPair at SparkUtils.java:161) as input to shuffle 38
20:14:12.588 INFO DAGScheduler - Got job 141 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:12.588 INFO DAGScheduler - Final stage: ResultStage 191 (runJob at SparkHadoopWriter.scala:83)
20:14:12.588 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 190)
20:14:12.588 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 190)
20:14:12.588 INFO DAGScheduler - Submitting ShuffleMapStage 190 (MapPartitionsRDD[904] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:12.605 INFO MemoryStore - Block broadcast_376 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
20:14:12.606 INFO MemoryStore - Block broadcast_376_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
20:14:12.606 INFO BlockManagerInfo - Added broadcast_376_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.1 MiB)
20:14:12.607 INFO SparkContext - Created broadcast 376 from broadcast at DAGScheduler.scala:1580
20:14:12.607 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 190 (MapPartitionsRDD[904] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:12.607 INFO TaskSchedulerImpl - Adding task set 190.0 with 1 tasks resource profile 0
20:14:12.607 INFO TaskSetManager - Starting task 0.0 in stage 190.0 (TID 246) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:12.608 INFO Executor - Running task 0.0 in stage 190.0 (TID 246)
20:14:12.638 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:12.652 INFO Executor - Finished task 0.0 in stage 190.0 (TID 246). 1148 bytes result sent to driver
20:14:12.653 INFO TaskSetManager - Finished task 0.0 in stage 190.0 (TID 246) in 46 ms on localhost (executor driver) (1/1)
20:14:12.653 INFO TaskSchedulerImpl - Removed TaskSet 190.0, whose tasks have all completed, from pool
20:14:12.653 INFO DAGScheduler - ShuffleMapStage 190 (mapToPair at SparkUtils.java:161) finished in 0.065 s
20:14:12.653 INFO DAGScheduler - looking for newly runnable stages
20:14:12.653 INFO DAGScheduler - running: HashSet()
20:14:12.653 INFO DAGScheduler - waiting: HashSet(ResultStage 191)
20:14:12.653 INFO DAGScheduler - failed: HashSet()
20:14:12.653 INFO DAGScheduler - Submitting ResultStage 191 (MapPartitionsRDD[909] at mapToPair at BamSink.java:91), which has no missing parents
20:14:12.660 INFO MemoryStore - Block broadcast_377 stored as values in memory (estimated size 241.4 KiB, free 1915.1 MiB)
20:14:12.661 INFO MemoryStore - Block broadcast_377_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.1 MiB)
20:14:12.661 INFO BlockManagerInfo - Added broadcast_377_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.1 MiB)
20:14:12.661 INFO SparkContext - Created broadcast 377 from broadcast at DAGScheduler.scala:1580
20:14:12.661 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 191 (MapPartitionsRDD[909] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:12.661 INFO TaskSchedulerImpl - Adding task set 191.0 with 1 tasks resource profile 0
20:14:12.661 INFO TaskSetManager - Starting task 0.0 in stage 191.0 (TID 247) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:12.662 INFO Executor - Running task 0.0 in stage 191.0 (TID 247)
20:14:12.666 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:12.666 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:12.681 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:12.681 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:12.681 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:12.681 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:12.681 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:12.681 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:12.696 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014124040803296881431148_0909_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace17076270039831167244/_temporary/0/task_202502102014124040803296881431148_0909_r_000000
20:14:12.696 INFO SparkHadoopMapRedUtil - attempt_202502102014124040803296881431148_0909_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:12.696 INFO Executor - Finished task 0.0 in stage 191.0 (TID 247). 1858 bytes result sent to driver
20:14:12.697 INFO TaskSetManager - Finished task 0.0 in stage 191.0 (TID 247) in 36 ms on localhost (executor driver) (1/1)
20:14:12.697 INFO TaskSchedulerImpl - Removed TaskSet 191.0, whose tasks have all completed, from pool
20:14:12.697 INFO DAGScheduler - ResultStage 191 (runJob at SparkHadoopWriter.scala:83) finished in 0.044 s
20:14:12.697 INFO DAGScheduler - Job 141 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.697 INFO TaskSchedulerImpl - Killing all running tasks in stage 191: Stage finished
20:14:12.697 INFO DAGScheduler - Job 141 finished: runJob at SparkHadoopWriter.scala:83, took 0.109766 s
20:14:12.697 INFO SparkHadoopWriter - Start to commit write Job job_202502102014124040803296881431148_0909.
20:14:12.702 INFO SparkHadoopWriter - Write Job job_202502102014124040803296881431148_0909 committed. Elapsed time: 4 ms.
20:14:12.713 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest16658355723989109209.bam
20:14:12.717 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest16658355723989109209.bam done
20:14:12.719 INFO MemoryStore - Block broadcast_378 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
20:14:12.725 INFO MemoryStore - Block broadcast_378_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1914.7 MiB)
20:14:12.725 INFO BlockManagerInfo - Added broadcast_378_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.0 MiB)
20:14:12.725 INFO SparkContext - Created broadcast 378 from newAPIHadoopFile at PathSplitSource.java:96
20:14:12.745 INFO FileInputFormat - Total input files to process : 1
20:14:12.780 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:12.781 INFO DAGScheduler - Got job 142 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:12.781 INFO DAGScheduler - Final stage: ResultStage 192 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:12.781 INFO DAGScheduler - Parents of final stage: List()
20:14:12.781 INFO DAGScheduler - Missing parents: List()
20:14:12.781 INFO DAGScheduler - Submitting ResultStage 192 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:12.797 INFO MemoryStore - Block broadcast_379 stored as values in memory (estimated size 426.2 KiB, free 1914.3 MiB)
20:14:12.802 INFO BlockManagerInfo - Removed broadcast_374_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.0 MiB)
20:14:12.802 INFO BlockManagerInfo - Removed broadcast_375_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.0 MiB)
20:14:12.803 INFO MemoryStore - Block broadcast_379_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1914.5 MiB)
20:14:12.803 INFO BlockManagerInfo - Added broadcast_379_piece0 in memory on localhost:35739 (size: 153.7 KiB, free: 1918.9 MiB)
20:14:12.803 INFO BlockManagerInfo - Removed broadcast_366_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1918.9 MiB)
20:14:12.803 INFO SparkContext - Created broadcast 379 from broadcast at DAGScheduler.scala:1580
20:14:12.803 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 192 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:12.803 INFO TaskSchedulerImpl - Adding task set 192.0 with 1 tasks resource profile 0
20:14:12.803 INFO BlockManagerInfo - Removed broadcast_361_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.0 MiB)
20:14:12.804 INFO BlockManagerInfo - Removed broadcast_369_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.0 MiB)
20:14:12.804 INFO TaskSetManager - Starting task 0.0 in stage 192.0 (TID 248) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:12.804 INFO Executor - Running task 0.0 in stage 192.0 (TID 248)
20:14:12.805 INFO BlockManagerInfo - Removed broadcast_367_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.0 MiB)
20:14:12.805 INFO BlockManagerInfo - Removed broadcast_363_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.1 MiB)
20:14:12.806 INFO BlockManagerInfo - Removed broadcast_368_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:14:12.806 INFO BlockManagerInfo - Removed broadcast_373_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:14:12.807 INFO BlockManagerInfo - Removed broadcast_370_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.3 MiB)
20:14:12.807 INFO BlockManagerInfo - Removed broadcast_364_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.3 MiB)
20:14:12.808 INFO BlockManagerInfo - Removed broadcast_377_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.4 MiB)
20:14:12.808 INFO BlockManagerInfo - Removed broadcast_371_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.4 MiB)
20:14:12.809 INFO BlockManagerInfo - Removed broadcast_365_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.6 MiB)
20:14:12.810 INFO BlockManagerInfo - Removed broadcast_376_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.8 MiB)
20:14:12.836 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest16658355723989109209.bam:0+237038
20:14:12.849 INFO Executor - Finished task 0.0 in stage 192.0 (TID 248). 651483 bytes result sent to driver
20:14:12.851 INFO TaskSetManager - Finished task 0.0 in stage 192.0 (TID 248) in 47 ms on localhost (executor driver) (1/1)
20:14:12.851 INFO TaskSchedulerImpl - Removed TaskSet 192.0, whose tasks have all completed, from pool
20:14:12.851 INFO DAGScheduler - ResultStage 192 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.070 s
20:14:12.851 INFO DAGScheduler - Job 142 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.851 INFO TaskSchedulerImpl - Killing all running tasks in stage 192: Stage finished
20:14:12.852 INFO DAGScheduler - Job 142 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.071220 s
20:14:12.866 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:12.867 INFO DAGScheduler - Got job 143 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:12.867 INFO DAGScheduler - Final stage: ResultStage 193 (count at ReadsSparkSinkUnitTest.java:185)
20:14:12.867 INFO DAGScheduler - Parents of final stage: List()
20:14:12.867 INFO DAGScheduler - Missing parents: List()
20:14:12.867 INFO DAGScheduler - Submitting ResultStage 193 (MapPartitionsRDD[897] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:12.893 INFO MemoryStore - Block broadcast_380 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
20:14:12.894 INFO MemoryStore - Block broadcast_380_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
20:14:12.895 INFO BlockManagerInfo - Added broadcast_380_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.6 MiB)
20:14:12.895 INFO SparkContext - Created broadcast 380 from broadcast at DAGScheduler.scala:1580
20:14:12.895 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 193 (MapPartitionsRDD[897] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:12.895 INFO TaskSchedulerImpl - Adding task set 193.0 with 1 tasks resource profile 0
20:14:12.895 INFO TaskSetManager - Starting task 0.0 in stage 193.0 (TID 249) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:12.896 INFO Executor - Running task 0.0 in stage 193.0 (TID 249)
20:14:12.924 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:12.933 INFO Executor - Finished task 0.0 in stage 193.0 (TID 249). 989 bytes result sent to driver
20:14:12.934 INFO TaskSetManager - Finished task 0.0 in stage 193.0 (TID 249) in 39 ms on localhost (executor driver) (1/1)
20:14:12.934 INFO TaskSchedulerImpl - Removed TaskSet 193.0, whose tasks have all completed, from pool
20:14:12.934 INFO DAGScheduler - ResultStage 193 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.067 s
20:14:12.934 INFO DAGScheduler - Job 143 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.934 INFO TaskSchedulerImpl - Killing all running tasks in stage 193: Stage finished
20:14:12.934 INFO DAGScheduler - Job 143 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.067404 s
20:14:12.937 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:12.937 INFO DAGScheduler - Got job 144 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:12.937 INFO DAGScheduler - Final stage: ResultStage 194 (count at ReadsSparkSinkUnitTest.java:185)
20:14:12.937 INFO DAGScheduler - Parents of final stage: List()
20:14:12.937 INFO DAGScheduler - Missing parents: List()
20:14:12.937 INFO DAGScheduler - Submitting ResultStage 194 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:12.954 INFO MemoryStore - Block broadcast_381 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:14:12.955 INFO MemoryStore - Block broadcast_381_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:14:12.955 INFO BlockManagerInfo - Added broadcast_381_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:12.955 INFO SparkContext - Created broadcast 381 from broadcast at DAGScheduler.scala:1580
20:14:12.956 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 194 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:12.956 INFO TaskSchedulerImpl - Adding task set 194.0 with 1 tasks resource profile 0
20:14:12.956 INFO TaskSetManager - Starting task 0.0 in stage 194.0 (TID 250) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:12.956 INFO Executor - Running task 0.0 in stage 194.0 (TID 250)
20:14:12.984 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest16658355723989109209.bam:0+237038
20:14:12.994 INFO Executor - Finished task 0.0 in stage 194.0 (TID 250). 989 bytes result sent to driver
20:14:12.995 INFO TaskSetManager - Finished task 0.0 in stage 194.0 (TID 250) in 39 ms on localhost (executor driver) (1/1)
20:14:12.995 INFO TaskSchedulerImpl - Removed TaskSet 194.0, whose tasks have all completed, from pool
20:14:12.995 INFO DAGScheduler - ResultStage 194 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:12.995 INFO DAGScheduler - Job 144 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:12.995 INFO TaskSchedulerImpl - Killing all running tasks in stage 194: Stage finished
20:14:12.995 INFO DAGScheduler - Job 144 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058106 s
20:14:13.003 INFO MemoryStore - Block broadcast_382 stored as values in memory (estimated size 298.0 KiB, free 1917.3 MiB)
20:14:13.009 INFO MemoryStore - Block broadcast_382_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.3 MiB)
20:14:13.009 INFO BlockManagerInfo - Added broadcast_382_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.4 MiB)
20:14:13.009 INFO SparkContext - Created broadcast 382 from newAPIHadoopFile at PathSplitSource.java:96
20:14:13.030 INFO MemoryStore - Block broadcast_383 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
20:14:13.036 INFO MemoryStore - Block broadcast_383_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.9 MiB)
20:14:13.036 INFO BlockManagerInfo - Added broadcast_383_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.4 MiB)
20:14:13.037 INFO SparkContext - Created broadcast 383 from newAPIHadoopFile at PathSplitSource.java:96
20:14:13.057 INFO FileInputFormat - Total input files to process : 1
20:14:13.059 INFO MemoryStore - Block broadcast_384 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
20:14:13.060 INFO MemoryStore - Block broadcast_384_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
20:14:13.060 INFO BlockManagerInfo - Added broadcast_384_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:13.061 INFO SparkContext - Created broadcast 384 from broadcast at ReadsSparkSink.java:133
20:14:13.062 INFO MemoryStore - Block broadcast_385 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
20:14:13.063 INFO MemoryStore - Block broadcast_385_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
20:14:13.063 INFO BlockManagerInfo - Added broadcast_385_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:13.063 INFO SparkContext - Created broadcast 385 from broadcast at BamSink.java:76
20:14:13.066 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.066 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.066 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:13.087 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:13.087 INFO DAGScheduler - Registering RDD 930 (mapToPair at SparkUtils.java:161) as input to shuffle 39
20:14:13.087 INFO DAGScheduler - Got job 145 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:13.087 INFO DAGScheduler - Final stage: ResultStage 196 (runJob at SparkHadoopWriter.scala:83)
20:14:13.087 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 195)
20:14:13.087 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 195)
20:14:13.088 INFO DAGScheduler - Submitting ShuffleMapStage 195 (MapPartitionsRDD[930] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:13.105 INFO MemoryStore - Block broadcast_386 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
20:14:13.106 INFO MemoryStore - Block broadcast_386_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
20:14:13.106 INFO BlockManagerInfo - Added broadcast_386_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.2 MiB)
20:14:13.106 INFO SparkContext - Created broadcast 386 from broadcast at DAGScheduler.scala:1580
20:14:13.107 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 195 (MapPartitionsRDD[930] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:13.107 INFO TaskSchedulerImpl - Adding task set 195.0 with 1 tasks resource profile 0
20:14:13.107 INFO TaskSetManager - Starting task 0.0 in stage 195.0 (TID 251) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
20:14:13.107 INFO Executor - Running task 0.0 in stage 195.0 (TID 251)
20:14:13.137 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:14:13.154 INFO Executor - Finished task 0.0 in stage 195.0 (TID 251). 1148 bytes result sent to driver
20:14:13.154 INFO TaskSetManager - Finished task 0.0 in stage 195.0 (TID 251) in 47 ms on localhost (executor driver) (1/1)
20:14:13.154 INFO TaskSchedulerImpl - Removed TaskSet 195.0, whose tasks have all completed, from pool
20:14:13.154 INFO DAGScheduler - ShuffleMapStage 195 (mapToPair at SparkUtils.java:161) finished in 0.066 s
20:14:13.154 INFO DAGScheduler - looking for newly runnable stages
20:14:13.154 INFO DAGScheduler - running: HashSet()
20:14:13.154 INFO DAGScheduler - waiting: HashSet(ResultStage 196)
20:14:13.154 INFO DAGScheduler - failed: HashSet()
20:14:13.154 INFO DAGScheduler - Submitting ResultStage 196 (MapPartitionsRDD[935] at mapToPair at BamSink.java:91), which has no missing parents
20:14:13.165 INFO MemoryStore - Block broadcast_387 stored as values in memory (estimated size 241.4 KiB, free 1915.7 MiB)
20:14:13.166 INFO MemoryStore - Block broadcast_387_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.6 MiB)
20:14:13.166 INFO BlockManagerInfo - Added broadcast_387_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.1 MiB)
20:14:13.167 INFO SparkContext - Created broadcast 387 from broadcast at DAGScheduler.scala:1580
20:14:13.167 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 196 (MapPartitionsRDD[935] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:13.167 INFO TaskSchedulerImpl - Adding task set 196.0 with 1 tasks resource profile 0
20:14:13.167 INFO TaskSetManager - Starting task 0.0 in stage 196.0 (TID 252) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:13.168 INFO Executor - Running task 0.0 in stage 196.0 (TID 252)
20:14:13.174 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:13.174 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:13.187 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.187 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.187 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:13.187 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.187 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.187 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:13.213 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014132899921334266619157_0935_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest2.someOtherPlace16110962243356412088/_temporary/0/task_202502102014132899921334266619157_0935_r_000000
20:14:13.213 INFO SparkHadoopMapRedUtil - attempt_202502102014132899921334266619157_0935_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:13.214 INFO Executor - Finished task 0.0 in stage 196.0 (TID 252). 1858 bytes result sent to driver
20:14:13.214 INFO TaskSetManager - Finished task 0.0 in stage 196.0 (TID 252) in 47 ms on localhost (executor driver) (1/1)
20:14:13.214 INFO TaskSchedulerImpl - Removed TaskSet 196.0, whose tasks have all completed, from pool
20:14:13.215 INFO DAGScheduler - ResultStage 196 (runJob at SparkHadoopWriter.scala:83) finished in 0.059 s
20:14:13.215 INFO DAGScheduler - Job 145 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:13.215 INFO TaskSchedulerImpl - Killing all running tasks in stage 196: Stage finished
20:14:13.215 INFO DAGScheduler - Job 145 finished: runJob at SparkHadoopWriter.scala:83, took 0.127865 s
20:14:13.215 INFO SparkHadoopWriter - Start to commit write Job job_202502102014132899921334266619157_0935.
20:14:13.220 INFO SparkHadoopWriter - Write Job job_202502102014132899921334266619157_0935 committed. Elapsed time: 4 ms.
20:14:13.232 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest27223876377055394160.bam
20:14:13.236 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest27223876377055394160.bam done
20:14:13.236 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest2.someOtherPlace16110962243356412088 to /tmp/ReadsSparkSinkUnitTest27223876377055394160.bam.sbi
20:14:13.241 INFO IndexFileMerger - Done merging .sbi files
20:14:13.241 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest2.someOtherPlace16110962243356412088 to /tmp/ReadsSparkSinkUnitTest27223876377055394160.bam.bai
20:14:13.246 INFO IndexFileMerger - Done merging .bai files
20:14:13.249 INFO MemoryStore - Block broadcast_388 stored as values in memory (estimated size 320.0 B, free 1915.6 MiB)
20:14:13.249 INFO MemoryStore - Block broadcast_388_piece0 stored as bytes in memory (estimated size 233.0 B, free 1915.6 MiB)
20:14:13.249 INFO BlockManagerInfo - Added broadcast_388_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.1 MiB)
20:14:13.249 INFO SparkContext - Created broadcast 388 from broadcast at BamSource.java:104
20:14:13.250 INFO MemoryStore - Block broadcast_389 stored as values in memory (estimated size 297.9 KiB, free 1915.3 MiB)
20:14:13.256 INFO MemoryStore - Block broadcast_389_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
20:14:13.257 INFO BlockManagerInfo - Added broadcast_389_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.1 MiB)
20:14:13.257 INFO SparkContext - Created broadcast 389 from newAPIHadoopFile at PathSplitSource.java:96
20:14:13.265 INFO FileInputFormat - Total input files to process : 1
20:14:13.280 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:13.280 INFO DAGScheduler - Got job 146 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:13.280 INFO DAGScheduler - Final stage: ResultStage 197 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:13.280 INFO DAGScheduler - Parents of final stage: List()
20:14:13.280 INFO DAGScheduler - Missing parents: List()
20:14:13.280 INFO DAGScheduler - Submitting ResultStage 197 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:13.286 INFO MemoryStore - Block broadcast_390 stored as values in memory (estimated size 148.2 KiB, free 1915.1 MiB)
20:14:13.287 INFO MemoryStore - Block broadcast_390_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.1 MiB)
20:14:13.287 INFO BlockManagerInfo - Added broadcast_390_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.0 MiB)
20:14:13.287 INFO SparkContext - Created broadcast 390 from broadcast at DAGScheduler.scala:1580
20:14:13.287 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 197 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:13.287 INFO TaskSchedulerImpl - Adding task set 197.0 with 1 tasks resource profile 0
20:14:13.288 INFO TaskSetManager - Starting task 0.0 in stage 197.0 (TID 253) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:13.288 INFO Executor - Running task 0.0 in stage 197.0 (TID 253)
20:14:13.299 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest27223876377055394160.bam:0+235514
20:14:13.304 INFO Executor - Finished task 0.0 in stage 197.0 (TID 253). 650184 bytes result sent to driver
20:14:13.305 INFO TaskSetManager - Finished task 0.0 in stage 197.0 (TID 253) in 17 ms on localhost (executor driver) (1/1)
20:14:13.305 INFO TaskSchedulerImpl - Removed TaskSet 197.0, whose tasks have all completed, from pool
20:14:13.305 INFO DAGScheduler - ResultStage 197 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.024 s
20:14:13.306 INFO DAGScheduler - Job 146 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:13.306 INFO TaskSchedulerImpl - Killing all running tasks in stage 197: Stage finished
20:14:13.306 INFO DAGScheduler - Job 146 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.025658 s
20:14:13.315 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:13.315 INFO DAGScheduler - Got job 147 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:13.315 INFO DAGScheduler - Final stage: ResultStage 198 (count at ReadsSparkSinkUnitTest.java:185)
20:14:13.315 INFO DAGScheduler - Parents of final stage: List()
20:14:13.316 INFO DAGScheduler - Missing parents: List()
20:14:13.316 INFO DAGScheduler - Submitting ResultStage 198 (MapPartitionsRDD[923] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:13.332 INFO MemoryStore - Block broadcast_391 stored as values in memory (estimated size 426.1 KiB, free 1914.7 MiB)
20:14:13.333 INFO MemoryStore - Block broadcast_391_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1914.5 MiB)
20:14:13.333 INFO BlockManagerInfo - Added broadcast_391_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1918.9 MiB)
20:14:13.334 INFO SparkContext - Created broadcast 391 from broadcast at DAGScheduler.scala:1580
20:14:13.334 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 198 (MapPartitionsRDD[923] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:13.334 INFO TaskSchedulerImpl - Adding task set 198.0 with 1 tasks resource profile 0
20:14:13.334 INFO TaskSetManager - Starting task 0.0 in stage 198.0 (TID 254) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
20:14:13.335 INFO Executor - Running task 0.0 in stage 198.0 (TID 254)
20:14:13.363 INFO BlockManagerInfo - Removed broadcast_384_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1918.9 MiB)
20:14:13.364 INFO BlockManagerInfo - Removed broadcast_380_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.0 MiB)
20:14:13.364 INFO BlockManagerInfo - Removed broadcast_386_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.2 MiB)
20:14:13.365 INFO BlockManagerInfo - Removed broadcast_390_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.2 MiB)
20:14:13.365 INFO BlockManagerInfo - Removed broadcast_379_piece0 on localhost:35739 in memory (size: 153.7 KiB, free: 1919.4 MiB)
20:14:13.366 INFO BlockManagerInfo - Removed broadcast_378_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:13.366 INFO BlockManagerInfo - Removed broadcast_372_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:13.366 INFO BlockManagerInfo - Removed broadcast_385_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:13.367 INFO BlockManagerInfo - Removed broadcast_387_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.6 MiB)
20:14:13.367 INFO BlockManagerInfo - Removed broadcast_383_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.6 MiB)
20:14:13.368 INFO BlockManagerInfo - Removed broadcast_381_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.8 MiB)
20:14:13.370 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:14:13.381 INFO Executor - Finished task 0.0 in stage 198.0 (TID 254). 1032 bytes result sent to driver
20:14:13.382 INFO TaskSetManager - Finished task 0.0 in stage 198.0 (TID 254) in 48 ms on localhost (executor driver) (1/1)
20:14:13.382 INFO TaskSchedulerImpl - Removed TaskSet 198.0, whose tasks have all completed, from pool
20:14:13.382 INFO DAGScheduler - ResultStage 198 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.066 s
20:14:13.382 INFO DAGScheduler - Job 147 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:13.382 INFO TaskSchedulerImpl - Killing all running tasks in stage 198: Stage finished
20:14:13.382 INFO DAGScheduler - Job 147 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.067060 s
20:14:13.386 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:13.387 INFO DAGScheduler - Got job 148 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:13.387 INFO DAGScheduler - Final stage: ResultStage 199 (count at ReadsSparkSinkUnitTest.java:185)
20:14:13.387 INFO DAGScheduler - Parents of final stage: List()
20:14:13.387 INFO DAGScheduler - Missing parents: List()
20:14:13.387 INFO DAGScheduler - Submitting ResultStage 199 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:13.397 INFO MemoryStore - Block broadcast_392 stored as values in memory (estimated size 148.1 KiB, free 1918.6 MiB)
20:14:13.397 INFO MemoryStore - Block broadcast_392_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.6 MiB)
20:14:13.398 INFO BlockManagerInfo - Added broadcast_392_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.7 MiB)
20:14:13.398 INFO SparkContext - Created broadcast 392 from broadcast at DAGScheduler.scala:1580
20:14:13.398 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 199 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:13.398 INFO TaskSchedulerImpl - Adding task set 199.0 with 1 tasks resource profile 0
20:14:13.398 INFO TaskSetManager - Starting task 0.0 in stage 199.0 (TID 255) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:13.399 INFO Executor - Running task 0.0 in stage 199.0 (TID 255)
20:14:13.410 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest27223876377055394160.bam:0+235514
20:14:13.413 INFO Executor - Finished task 0.0 in stage 199.0 (TID 255). 989 bytes result sent to driver
20:14:13.414 INFO TaskSetManager - Finished task 0.0 in stage 199.0 (TID 255) in 16 ms on localhost (executor driver) (1/1)
20:14:13.414 INFO TaskSchedulerImpl - Removed TaskSet 199.0, whose tasks have all completed, from pool
20:14:13.414 INFO DAGScheduler - ResultStage 199 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.027 s
20:14:13.414 INFO DAGScheduler - Job 148 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:13.414 INFO TaskSchedulerImpl - Killing all running tasks in stage 199: Stage finished
20:14:13.414 INFO DAGScheduler - Job 148 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027739 s
20:14:13.422 INFO MemoryStore - Block broadcast_393 stored as values in memory (estimated size 298.0 KiB, free 1918.3 MiB)
20:14:13.429 INFO MemoryStore - Block broadcast_393_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
20:14:13.429 INFO BlockManagerInfo - Added broadcast_393_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:13.429 INFO SparkContext - Created broadcast 393 from newAPIHadoopFile at PathSplitSource.java:96
20:14:13.450 INFO MemoryStore - Block broadcast_394 stored as values in memory (estimated size 298.0 KiB, free 1917.9 MiB)
20:14:13.461 INFO MemoryStore - Block broadcast_394_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
20:14:13.461 INFO BlockManagerInfo - Added broadcast_394_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:13.461 INFO SparkContext - Created broadcast 394 from newAPIHadoopFile at PathSplitSource.java:96
20:14:13.488 INFO FileInputFormat - Total input files to process : 1
20:14:13.490 INFO MemoryStore - Block broadcast_395 stored as values in memory (estimated size 19.6 KiB, free 1917.9 MiB)
20:14:13.490 INFO MemoryStore - Block broadcast_395_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.9 MiB)
20:14:13.490 INFO BlockManagerInfo - Added broadcast_395_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.6 MiB)
20:14:13.490 INFO SparkContext - Created broadcast 395 from broadcast at ReadsSparkSink.java:133
20:14:13.491 INFO MemoryStore - Block broadcast_396 stored as values in memory (estimated size 20.0 KiB, free 1917.8 MiB)
20:14:13.492 INFO MemoryStore - Block broadcast_396_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.8 MiB)
20:14:13.492 INFO BlockManagerInfo - Added broadcast_396_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.6 MiB)
20:14:13.492 INFO SparkContext - Created broadcast 396 from broadcast at BamSink.java:76
20:14:13.494 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.494 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.494 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:13.510 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:13.511 INFO DAGScheduler - Registering RDD 955 (mapToPair at SparkUtils.java:161) as input to shuffle 40
20:14:13.511 INFO DAGScheduler - Got job 149 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:13.511 INFO DAGScheduler - Final stage: ResultStage 201 (runJob at SparkHadoopWriter.scala:83)
20:14:13.511 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 200)
20:14:13.511 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 200)
20:14:13.511 INFO DAGScheduler - Submitting ShuffleMapStage 200 (MapPartitionsRDD[955] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:13.528 INFO MemoryStore - Block broadcast_397 stored as values in memory (estimated size 434.3 KiB, free 1917.4 MiB)
20:14:13.529 INFO MemoryStore - Block broadcast_397_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1917.3 MiB)
20:14:13.530 INFO BlockManagerInfo - Added broadcast_397_piece0 in memory on localhost:35739 (size: 157.6 KiB, free: 1919.4 MiB)
20:14:13.530 INFO SparkContext - Created broadcast 397 from broadcast at DAGScheduler.scala:1580
20:14:13.530 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 200 (MapPartitionsRDD[955] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:13.530 INFO TaskSchedulerImpl - Adding task set 200.0 with 1 tasks resource profile 0
20:14:13.530 INFO TaskSetManager - Starting task 0.0 in stage 200.0 (TID 256) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
20:14:13.531 INFO Executor - Running task 0.0 in stage 200.0 (TID 256)
20:14:13.561 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:14:13.576 INFO Executor - Finished task 0.0 in stage 200.0 (TID 256). 1148 bytes result sent to driver
20:14:13.577 INFO TaskSetManager - Finished task 0.0 in stage 200.0 (TID 256) in 47 ms on localhost (executor driver) (1/1)
20:14:13.577 INFO TaskSchedulerImpl - Removed TaskSet 200.0, whose tasks have all completed, from pool
20:14:13.577 INFO DAGScheduler - ShuffleMapStage 200 (mapToPair at SparkUtils.java:161) finished in 0.066 s
20:14:13.577 INFO DAGScheduler - looking for newly runnable stages
20:14:13.577 INFO DAGScheduler - running: HashSet()
20:14:13.577 INFO DAGScheduler - waiting: HashSet(ResultStage 201)
20:14:13.577 INFO DAGScheduler - failed: HashSet()
20:14:13.577 INFO DAGScheduler - Submitting ResultStage 201 (MapPartitionsRDD[960] at mapToPair at BamSink.java:91), which has no missing parents
20:14:13.587 INFO MemoryStore - Block broadcast_398 stored as values in memory (estimated size 155.3 KiB, free 1917.1 MiB)
20:14:13.588 INFO MemoryStore - Block broadcast_398_piece0 stored as bytes in memory (estimated size 58.4 KiB, free 1917.0 MiB)
20:14:13.588 INFO BlockManagerInfo - Added broadcast_398_piece0 in memory on localhost:35739 (size: 58.4 KiB, free: 1919.4 MiB)
20:14:13.588 INFO SparkContext - Created broadcast 398 from broadcast at DAGScheduler.scala:1580
20:14:13.588 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 201 (MapPartitionsRDD[960] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:13.588 INFO TaskSchedulerImpl - Adding task set 201.0 with 1 tasks resource profile 0
20:14:13.589 INFO TaskSetManager - Starting task 0.0 in stage 201.0 (TID 257) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:13.589 INFO Executor - Running task 0.0 in stage 201.0 (TID 257)
20:14:13.595 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:13.595 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:13.609 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.609 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.609 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:13.609 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.609 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.609 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:13.632 INFO FileOutputCommitter - Saved output of task 'attempt_20250210201413352024639135632865_0960_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest3.someOtherPlace489452279910583555/_temporary/0/task_20250210201413352024639135632865_0960_r_000000
20:14:13.632 INFO SparkHadoopMapRedUtil - attempt_20250210201413352024639135632865_0960_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:13.633 INFO Executor - Finished task 0.0 in stage 201.0 (TID 257). 1858 bytes result sent to driver
20:14:13.633 INFO TaskSetManager - Finished task 0.0 in stage 201.0 (TID 257) in 45 ms on localhost (executor driver) (1/1)
20:14:13.633 INFO TaskSchedulerImpl - Removed TaskSet 201.0, whose tasks have all completed, from pool
20:14:13.633 INFO DAGScheduler - ResultStage 201 (runJob at SparkHadoopWriter.scala:83) finished in 0.056 s
20:14:13.633 INFO DAGScheduler - Job 149 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:13.633 INFO TaskSchedulerImpl - Killing all running tasks in stage 201: Stage finished
20:14:13.633 INFO DAGScheduler - Job 149 finished: runJob at SparkHadoopWriter.scala:83, took 0.122892 s
20:14:13.634 INFO SparkHadoopWriter - Start to commit write Job job_20250210201413352024639135632865_0960.
20:14:13.638 INFO SparkHadoopWriter - Write Job job_20250210201413352024639135632865_0960 committed. Elapsed time: 4 ms.
20:14:13.649 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest37046961452811596762.bam
20:14:13.653 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest37046961452811596762.bam done
20:14:13.653 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest3.someOtherPlace489452279910583555 to /tmp/ReadsSparkSinkUnitTest37046961452811596762.bam.sbi
20:14:13.658 INFO IndexFileMerger - Done merging .sbi files
20:14:13.658 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest3.someOtherPlace489452279910583555 to /tmp/ReadsSparkSinkUnitTest37046961452811596762.bam.bai
20:14:13.662 INFO IndexFileMerger - Done merging .bai files
20:14:13.664 INFO MemoryStore - Block broadcast_399 stored as values in memory (estimated size 312.0 B, free 1917.0 MiB)
20:14:13.664 INFO MemoryStore - Block broadcast_399_piece0 stored as bytes in memory (estimated size 231.0 B, free 1917.0 MiB)
20:14:13.664 INFO BlockManagerInfo - Added broadcast_399_piece0 in memory on localhost:35739 (size: 231.0 B, free: 1919.4 MiB)
20:14:13.664 INFO SparkContext - Created broadcast 399 from broadcast at BamSource.java:104
20:14:13.666 INFO MemoryStore - Block broadcast_400 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
20:14:13.676 INFO MemoryStore - Block broadcast_400_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
20:14:13.676 INFO BlockManagerInfo - Added broadcast_400_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:13.677 INFO SparkContext - Created broadcast 400 from newAPIHadoopFile at PathSplitSource.java:96
20:14:13.690 INFO FileInputFormat - Total input files to process : 1
20:14:13.705 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:13.705 INFO DAGScheduler - Got job 150 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:13.705 INFO DAGScheduler - Final stage: ResultStage 202 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:13.705 INFO DAGScheduler - Parents of final stage: List()
20:14:13.705 INFO DAGScheduler - Missing parents: List()
20:14:13.705 INFO DAGScheduler - Submitting ResultStage 202 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:13.715 INFO MemoryStore - Block broadcast_401 stored as values in memory (estimated size 148.2 KiB, free 1916.6 MiB)
20:14:13.716 INFO MemoryStore - Block broadcast_401_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.5 MiB)
20:14:13.716 INFO BlockManagerInfo - Added broadcast_401_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.3 MiB)
20:14:13.716 INFO SparkContext - Created broadcast 401 from broadcast at DAGScheduler.scala:1580
20:14:13.716 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 202 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:13.716 INFO TaskSchedulerImpl - Adding task set 202.0 with 1 tasks resource profile 0
20:14:13.717 INFO TaskSetManager - Starting task 0.0 in stage 202.0 (TID 258) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:13.717 INFO Executor - Running task 0.0 in stage 202.0 (TID 258)
20:14:13.729 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest37046961452811596762.bam:0+236517
20:14:13.733 INFO Executor - Finished task 0.0 in stage 202.0 (TID 258). 749470 bytes result sent to driver
20:14:13.735 INFO TaskSetManager - Finished task 0.0 in stage 202.0 (TID 258) in 18 ms on localhost (executor driver) (1/1)
20:14:13.735 INFO TaskSchedulerImpl - Removed TaskSet 202.0, whose tasks have all completed, from pool
20:14:13.735 INFO DAGScheduler - ResultStage 202 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.030 s
20:14:13.735 INFO DAGScheduler - Job 150 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:13.735 INFO TaskSchedulerImpl - Killing all running tasks in stage 202: Stage finished
20:14:13.735 INFO DAGScheduler - Job 150 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.030623 s
20:14:13.745 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:13.745 INFO DAGScheduler - Got job 151 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:13.745 INFO DAGScheduler - Final stage: ResultStage 203 (count at ReadsSparkSinkUnitTest.java:185)
20:14:13.746 INFO DAGScheduler - Parents of final stage: List()
20:14:13.746 INFO DAGScheduler - Missing parents: List()
20:14:13.746 INFO DAGScheduler - Submitting ResultStage 203 (MapPartitionsRDD[948] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:13.763 INFO MemoryStore - Block broadcast_402 stored as values in memory (estimated size 426.1 KiB, free 1916.1 MiB)
20:14:13.765 INFO MemoryStore - Block broadcast_402_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.9 MiB)
20:14:13.765 INFO BlockManagerInfo - Added broadcast_402_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.1 MiB)
20:14:13.765 INFO SparkContext - Created broadcast 402 from broadcast at DAGScheduler.scala:1580
20:14:13.765 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 203 (MapPartitionsRDD[948] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:13.765 INFO TaskSchedulerImpl - Adding task set 203.0 with 1 tasks resource profile 0
20:14:13.766 INFO TaskSetManager - Starting task 0.0 in stage 203.0 (TID 259) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
20:14:13.766 INFO Executor - Running task 0.0 in stage 203.0 (TID 259)
20:14:13.796 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:14:13.803 INFO Executor - Finished task 0.0 in stage 203.0 (TID 259). 989 bytes result sent to driver
20:14:13.803 INFO TaskSetManager - Finished task 0.0 in stage 203.0 (TID 259) in 37 ms on localhost (executor driver) (1/1)
20:14:13.803 INFO TaskSchedulerImpl - Removed TaskSet 203.0, whose tasks have all completed, from pool
20:14:13.803 INFO DAGScheduler - ResultStage 203 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:13.803 INFO DAGScheduler - Job 151 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:13.803 INFO TaskSchedulerImpl - Killing all running tasks in stage 203: Stage finished
20:14:13.803 INFO DAGScheduler - Job 151 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058172 s
20:14:13.807 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:13.807 INFO DAGScheduler - Got job 152 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:13.807 INFO DAGScheduler - Final stage: ResultStage 204 (count at ReadsSparkSinkUnitTest.java:185)
20:14:13.807 INFO DAGScheduler - Parents of final stage: List()
20:14:13.807 INFO DAGScheduler - Missing parents: List()
20:14:13.807 INFO DAGScheduler - Submitting ResultStage 204 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:13.813 INFO MemoryStore - Block broadcast_403 stored as values in memory (estimated size 148.1 KiB, free 1915.8 MiB)
20:14:13.814 INFO MemoryStore - Block broadcast_403_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.7 MiB)
20:14:13.814 INFO BlockManagerInfo - Added broadcast_403_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.1 MiB)
20:14:13.814 INFO SparkContext - Created broadcast 403 from broadcast at DAGScheduler.scala:1580
20:14:13.814 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 204 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:13.814 INFO TaskSchedulerImpl - Adding task set 204.0 with 1 tasks resource profile 0
20:14:13.815 INFO TaskSetManager - Starting task 0.0 in stage 204.0 (TID 260) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:13.815 INFO Executor - Running task 0.0 in stage 204.0 (TID 260)
20:14:13.826 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest37046961452811596762.bam:0+236517
20:14:13.828 INFO Executor - Finished task 0.0 in stage 204.0 (TID 260). 989 bytes result sent to driver
20:14:13.829 INFO TaskSetManager - Finished task 0.0 in stage 204.0 (TID 260) in 14 ms on localhost (executor driver) (1/1)
20:14:13.829 INFO TaskSchedulerImpl - Removed TaskSet 204.0, whose tasks have all completed, from pool
20:14:13.829 INFO DAGScheduler - ResultStage 204 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
20:14:13.829 INFO DAGScheduler - Job 152 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:13.829 INFO TaskSchedulerImpl - Killing all running tasks in stage 204: Stage finished
20:14:13.829 INFO DAGScheduler - Job 152 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022206 s
20:14:13.836 INFO MemoryStore - Block broadcast_404 stored as values in memory (estimated size 576.0 B, free 1915.7 MiB)
20:14:13.840 INFO MemoryStore - Block broadcast_404_piece0 stored as bytes in memory (estimated size 228.0 B, free 1915.7 MiB)
20:14:13.841 INFO BlockManagerInfo - Added broadcast_404_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1919.1 MiB)
20:14:13.841 INFO SparkContext - Created broadcast 404 from broadcast at CramSource.java:114
20:14:13.841 INFO BlockManagerInfo - Removed broadcast_388_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.1 MiB)
20:14:13.841 INFO BlockManagerInfo - Removed broadcast_400_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.1 MiB)
20:14:13.842 INFO BlockManagerInfo - Removed broadcast_393_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.2 MiB)
20:14:13.842 INFO MemoryStore - Block broadcast_405 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
20:14:13.843 INFO BlockManagerInfo - Removed broadcast_398_piece0 on localhost:35739 in memory (size: 58.4 KiB, free: 1919.2 MiB)
20:14:13.844 INFO BlockManagerInfo - Removed broadcast_392_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.3 MiB)
20:14:13.844 INFO BlockManagerInfo - Removed broadcast_394_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:13.844 INFO BlockManagerInfo - Removed broadcast_391_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.5 MiB)
20:14:13.845 INFO BlockManagerInfo - Removed broadcast_382_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.5 MiB)
20:14:13.845 INFO BlockManagerInfo - Removed broadcast_389_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:13.846 INFO BlockManagerInfo - Removed broadcast_401_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.6 MiB)
20:14:13.846 INFO BlockManagerInfo - Removed broadcast_399_piece0 on localhost:35739 in memory (size: 231.0 B, free: 1919.6 MiB)
20:14:13.848 INFO BlockManagerInfo - Removed broadcast_396_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.6 MiB)
20:14:13.848 INFO BlockManagerInfo - Removed broadcast_395_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.6 MiB)
20:14:13.849 INFO BlockManagerInfo - Removed broadcast_402_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.8 MiB)
20:14:13.849 INFO BlockManagerInfo - Removed broadcast_397_piece0 on localhost:35739 in memory (size: 157.6 KiB, free: 1919.9 MiB)
20:14:13.850 INFO BlockManagerInfo - Removed broadcast_403_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1920.0 MiB)
20:14:13.851 INFO MemoryStore - Block broadcast_405_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.7 MiB)
20:14:13.851 INFO BlockManagerInfo - Added broadcast_405_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1920.0 MiB)
20:14:13.851 INFO SparkContext - Created broadcast 405 from newAPIHadoopFile at PathSplitSource.java:96
20:14:13.866 INFO MemoryStore - Block broadcast_406 stored as values in memory (estimated size 576.0 B, free 1919.7 MiB)
20:14:13.867 INFO MemoryStore - Block broadcast_406_piece0 stored as bytes in memory (estimated size 228.0 B, free 1919.7 MiB)
20:14:13.867 INFO BlockManagerInfo - Added broadcast_406_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1920.0 MiB)
20:14:13.867 INFO SparkContext - Created broadcast 406 from broadcast at CramSource.java:114
20:14:13.868 INFO MemoryStore - Block broadcast_407 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
20:14:13.874 INFO MemoryStore - Block broadcast_407_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:14:13.874 INFO BlockManagerInfo - Added broadcast_407_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:13.874 INFO SparkContext - Created broadcast 407 from newAPIHadoopFile at PathSplitSource.java:96
20:14:13.888 INFO FileInputFormat - Total input files to process : 1
20:14:13.889 INFO MemoryStore - Block broadcast_408 stored as values in memory (estimated size 6.0 KiB, free 1919.3 MiB)
20:14:13.889 INFO MemoryStore - Block broadcast_408_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
20:14:13.889 INFO BlockManagerInfo - Added broadcast_408_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.9 MiB)
20:14:13.889 INFO SparkContext - Created broadcast 408 from broadcast at ReadsSparkSink.java:133
20:14:13.890 INFO MemoryStore - Block broadcast_409 stored as values in memory (estimated size 6.2 KiB, free 1919.3 MiB)
20:14:13.890 INFO MemoryStore - Block broadcast_409_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
20:14:13.890 INFO BlockManagerInfo - Added broadcast_409_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.9 MiB)
20:14:13.891 INFO SparkContext - Created broadcast 409 from broadcast at CramSink.java:76
20:14:13.892 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.892 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.892 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:13.909 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:13.910 INFO DAGScheduler - Registering RDD 978 (mapToPair at SparkUtils.java:161) as input to shuffle 41
20:14:13.910 INFO DAGScheduler - Got job 153 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:13.910 INFO DAGScheduler - Final stage: ResultStage 206 (runJob at SparkHadoopWriter.scala:83)
20:14:13.910 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 205)
20:14:13.910 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 205)
20:14:13.910 INFO DAGScheduler - Submitting ShuffleMapStage 205 (MapPartitionsRDD[978] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:13.921 INFO MemoryStore - Block broadcast_410 stored as values in memory (estimated size 292.8 KiB, free 1919.0 MiB)
20:14:13.922 INFO MemoryStore - Block broadcast_410_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1918.9 MiB)
20:14:13.922 INFO BlockManagerInfo - Added broadcast_410_piece0 in memory on localhost:35739 (size: 107.3 KiB, free: 1919.8 MiB)
20:14:13.923 INFO SparkContext - Created broadcast 410 from broadcast at DAGScheduler.scala:1580
20:14:13.923 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 205 (MapPartitionsRDD[978] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:13.923 INFO TaskSchedulerImpl - Adding task set 205.0 with 1 tasks resource profile 0
20:14:13.923 INFO TaskSetManager - Starting task 0.0 in stage 205.0 (TID 261) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
20:14:13.924 INFO Executor - Running task 0.0 in stage 205.0 (TID 261)
20:14:13.944 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:13.954 INFO Executor - Finished task 0.0 in stage 205.0 (TID 261). 1148 bytes result sent to driver
20:14:13.954 INFO TaskSetManager - Finished task 0.0 in stage 205.0 (TID 261) in 31 ms on localhost (executor driver) (1/1)
20:14:13.954 INFO TaskSchedulerImpl - Removed TaskSet 205.0, whose tasks have all completed, from pool
20:14:13.955 INFO DAGScheduler - ShuffleMapStage 205 (mapToPair at SparkUtils.java:161) finished in 0.045 s
20:14:13.955 INFO DAGScheduler - looking for newly runnable stages
20:14:13.955 INFO DAGScheduler - running: HashSet()
20:14:13.955 INFO DAGScheduler - waiting: HashSet(ResultStage 206)
20:14:13.955 INFO DAGScheduler - failed: HashSet()
20:14:13.955 INFO DAGScheduler - Submitting ResultStage 206 (MapPartitionsRDD[983] at mapToPair at CramSink.java:89), which has no missing parents
20:14:13.966 INFO MemoryStore - Block broadcast_411 stored as values in memory (estimated size 153.2 KiB, free 1918.8 MiB)
20:14:13.967 INFO MemoryStore - Block broadcast_411_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1918.7 MiB)
20:14:13.967 INFO BlockManagerInfo - Added broadcast_411_piece0 in memory on localhost:35739 (size: 58.1 KiB, free: 1919.7 MiB)
20:14:13.967 INFO SparkContext - Created broadcast 411 from broadcast at DAGScheduler.scala:1580
20:14:13.967 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 206 (MapPartitionsRDD[983] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
20:14:13.967 INFO TaskSchedulerImpl - Adding task set 206.0 with 1 tasks resource profile 0
20:14:13.968 INFO TaskSetManager - Starting task 0.0 in stage 206.0 (TID 262) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:13.968 INFO Executor - Running task 0.0 in stage 206.0 (TID 262)
20:14:13.974 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:13.974 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:13.984 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.984 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.984 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:13.984 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:13.984 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:13.984 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:14.034 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014134114752450778082010_0983_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest5.someOtherPlace8541374047888609214/_temporary/0/task_202502102014134114752450778082010_0983_r_000000
20:14:14.034 INFO SparkHadoopMapRedUtil - attempt_202502102014134114752450778082010_0983_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:14.034 INFO Executor - Finished task 0.0 in stage 206.0 (TID 262). 1858 bytes result sent to driver
20:14:14.035 INFO TaskSetManager - Finished task 0.0 in stage 206.0 (TID 262) in 67 ms on localhost (executor driver) (1/1)
20:14:14.035 INFO TaskSchedulerImpl - Removed TaskSet 206.0, whose tasks have all completed, from pool
20:14:14.035 INFO DAGScheduler - ResultStage 206 (runJob at SparkHadoopWriter.scala:83) finished in 0.080 s
20:14:14.035 INFO DAGScheduler - Job 153 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:14.035 INFO TaskSchedulerImpl - Killing all running tasks in stage 206: Stage finished
20:14:14.035 INFO DAGScheduler - Job 153 finished: runJob at SparkHadoopWriter.scala:83, took 0.125883 s
20:14:14.035 INFO SparkHadoopWriter - Start to commit write Job job_202502102014134114752450778082010_0983.
20:14:14.040 INFO SparkHadoopWriter - Write Job job_202502102014134114752450778082010_0983 committed. Elapsed time: 4 ms.
20:14:14.053 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest510340376519741921477.cram
20:14:14.057 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest510340376519741921477.cram done
20:14:14.059 INFO MemoryStore - Block broadcast_412 stored as values in memory (estimated size 504.0 B, free 1918.7 MiB)
20:14:14.060 INFO MemoryStore - Block broadcast_412_piece0 stored as bytes in memory (estimated size 160.0 B, free 1918.7 MiB)
20:14:14.060 INFO BlockManagerInfo - Added broadcast_412_piece0 in memory on localhost:35739 (size: 160.0 B, free: 1919.7 MiB)
20:14:14.060 INFO SparkContext - Created broadcast 412 from broadcast at CramSource.java:114
20:14:14.062 INFO MemoryStore - Block broadcast_413 stored as values in memory (estimated size 297.9 KiB, free 1918.4 MiB)
20:14:14.072 INFO MemoryStore - Block broadcast_413_piece0 stored as bytes in memory (estimated size 50.1 KiB, free 1918.4 MiB)
20:14:14.072 INFO BlockManagerInfo - Added broadcast_413_piece0 in memory on localhost:35739 (size: 50.1 KiB, free: 1919.7 MiB)
20:14:14.072 INFO SparkContext - Created broadcast 413 from newAPIHadoopFile at PathSplitSource.java:96
20:14:14.092 INFO FileInputFormat - Total input files to process : 1
20:14:14.117 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:14.117 INFO DAGScheduler - Got job 154 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:14.117 INFO DAGScheduler - Final stage: ResultStage 207 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:14.117 INFO DAGScheduler - Parents of final stage: List()
20:14:14.117 INFO DAGScheduler - Missing parents: List()
20:14:14.117 INFO DAGScheduler - Submitting ResultStage 207 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:14.135 INFO MemoryStore - Block broadcast_414 stored as values in memory (estimated size 286.8 KiB, free 1918.1 MiB)
20:14:14.136 INFO MemoryStore - Block broadcast_414_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.0 MiB)
20:14:14.137 INFO BlockManagerInfo - Added broadcast_414_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.6 MiB)
20:14:14.137 INFO SparkContext - Created broadcast 414 from broadcast at DAGScheduler.scala:1580
20:14:14.137 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 207 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:14.137 INFO TaskSchedulerImpl - Adding task set 207.0 with 1 tasks resource profile 0
20:14:14.137 INFO TaskSetManager - Starting task 0.0 in stage 207.0 (TID 263) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7811 bytes)
20:14:14.138 INFO Executor - Running task 0.0 in stage 207.0 (TID 263)
20:14:14.158 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest510340376519741921477.cram:0+43715
20:14:14.169 INFO Executor - Finished task 0.0 in stage 207.0 (TID 263). 154058 bytes result sent to driver
20:14:14.170 INFO TaskSetManager - Finished task 0.0 in stage 207.0 (TID 263) in 33 ms on localhost (executor driver) (1/1)
20:14:14.170 INFO TaskSchedulerImpl - Removed TaskSet 207.0, whose tasks have all completed, from pool
20:14:14.170 INFO DAGScheduler - ResultStage 207 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.053 s
20:14:14.170 INFO DAGScheduler - Job 154 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:14.170 INFO TaskSchedulerImpl - Killing all running tasks in stage 207: Stage finished
20:14:14.170 INFO DAGScheduler - Job 154 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.053557 s
20:14:14.175 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:14.175 INFO DAGScheduler - Got job 155 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:14.175 INFO DAGScheduler - Final stage: ResultStage 208 (count at ReadsSparkSinkUnitTest.java:185)
20:14:14.175 INFO DAGScheduler - Parents of final stage: List()
20:14:14.175 INFO DAGScheduler - Missing parents: List()
20:14:14.176 INFO DAGScheduler - Submitting ResultStage 208 (MapPartitionsRDD[972] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:14.187 INFO MemoryStore - Block broadcast_415 stored as values in memory (estimated size 286.8 KiB, free 1917.7 MiB)
20:14:14.188 INFO MemoryStore - Block broadcast_415_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.6 MiB)
20:14:14.188 INFO BlockManagerInfo - Added broadcast_415_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.5 MiB)
20:14:14.188 INFO SparkContext - Created broadcast 415 from broadcast at DAGScheduler.scala:1580
20:14:14.188 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 208 (MapPartitionsRDD[972] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:14.188 INFO TaskSchedulerImpl - Adding task set 208.0 with 1 tasks resource profile 0
20:14:14.189 INFO TaskSetManager - Starting task 0.0 in stage 208.0 (TID 264) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
20:14:14.189 INFO Executor - Running task 0.0 in stage 208.0 (TID 264)
20:14:14.213 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:14.218 INFO Executor - Finished task 0.0 in stage 208.0 (TID 264). 989 bytes result sent to driver
20:14:14.219 INFO TaskSetManager - Finished task 0.0 in stage 208.0 (TID 264) in 30 ms on localhost (executor driver) (1/1)
20:14:14.219 INFO TaskSchedulerImpl - Removed TaskSet 208.0, whose tasks have all completed, from pool
20:14:14.219 INFO DAGScheduler - ResultStage 208 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.043 s
20:14:14.219 INFO DAGScheduler - Job 155 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:14.219 INFO TaskSchedulerImpl - Killing all running tasks in stage 208: Stage finished
20:14:14.219 INFO DAGScheduler - Job 155 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.044035 s
20:14:14.223 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:14.223 INFO DAGScheduler - Got job 156 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:14.223 INFO DAGScheduler - Final stage: ResultStage 209 (count at ReadsSparkSinkUnitTest.java:185)
20:14:14.223 INFO DAGScheduler - Parents of final stage: List()
20:14:14.223 INFO DAGScheduler - Missing parents: List()
20:14:14.223 INFO DAGScheduler - Submitting ResultStage 209 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:14.234 INFO MemoryStore - Block broadcast_416 stored as values in memory (estimated size 286.8 KiB, free 1917.3 MiB)
20:14:14.235 INFO MemoryStore - Block broadcast_416_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.2 MiB)
20:14:14.235 INFO BlockManagerInfo - Added broadcast_416_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.4 MiB)
20:14:14.235 INFO SparkContext - Created broadcast 416 from broadcast at DAGScheduler.scala:1580
20:14:14.235 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 209 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:14.236 INFO TaskSchedulerImpl - Adding task set 209.0 with 1 tasks resource profile 0
20:14:14.236 INFO TaskSetManager - Starting task 0.0 in stage 209.0 (TID 265) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7811 bytes)
20:14:14.236 INFO Executor - Running task 0.0 in stage 209.0 (TID 265)
20:14:14.256 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest510340376519741921477.cram:0+43715
20:14:14.273 INFO Executor - Finished task 0.0 in stage 209.0 (TID 265). 1075 bytes result sent to driver
20:14:14.273 INFO TaskSetManager - Finished task 0.0 in stage 209.0 (TID 265) in 37 ms on localhost (executor driver) (1/1)
20:14:14.273 INFO TaskSchedulerImpl - Removed TaskSet 209.0, whose tasks have all completed, from pool
20:14:14.274 INFO DAGScheduler - ResultStage 209 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.051 s
20:14:14.274 INFO DAGScheduler - Job 156 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:14.274 INFO TaskSchedulerImpl - Killing all running tasks in stage 209: Stage finished
20:14:14.274 INFO BlockManagerInfo - Removed broadcast_410_piece0 on localhost:35739 in memory (size: 107.3 KiB, free: 1919.5 MiB)
20:14:14.274 INFO DAGScheduler - Job 156 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.051306 s
20:14:14.274 INFO BlockManagerInfo - Removed broadcast_409_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.5 MiB)
20:14:14.275 INFO BlockManagerInfo - Removed broadcast_406_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.5 MiB)
20:14:14.276 INFO BlockManagerInfo - Removed broadcast_415_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.6 MiB)
20:14:14.276 INFO BlockManagerInfo - Removed broadcast_408_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.6 MiB)
20:14:14.277 INFO BlockManagerInfo - Removed broadcast_411_piece0 on localhost:35739 in memory (size: 58.1 KiB, free: 1919.7 MiB)
20:14:14.279 INFO BlockManagerInfo - Removed broadcast_414_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.8 MiB)
20:14:14.279 INFO BlockManagerInfo - Removed broadcast_407_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:14.284 INFO MemoryStore - Block broadcast_417 stored as values in memory (estimated size 297.9 KiB, free 1918.6 MiB)
20:14:14.294 INFO MemoryStore - Block broadcast_417_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.6 MiB)
20:14:14.294 INFO BlockManagerInfo - Added broadcast_417_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:14:14.294 INFO SparkContext - Created broadcast 417 from newAPIHadoopFile at PathSplitSource.java:96
20:14:14.321 INFO MemoryStore - Block broadcast_418 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:14.327 INFO MemoryStore - Block broadcast_418_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.3 MiB)
20:14:14.327 INFO BlockManagerInfo - Added broadcast_418_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:14.327 INFO SparkContext - Created broadcast 418 from newAPIHadoopFile at PathSplitSource.java:96
20:14:14.346 INFO FileInputFormat - Total input files to process : 1
20:14:14.348 INFO MemoryStore - Block broadcast_419 stored as values in memory (estimated size 160.7 KiB, free 1918.1 MiB)
20:14:14.349 INFO MemoryStore - Block broadcast_419_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.1 MiB)
20:14:14.349 INFO BlockManagerInfo - Added broadcast_419_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.7 MiB)
20:14:14.349 INFO SparkContext - Created broadcast 419 from broadcast at ReadsSparkSink.java:133
20:14:14.352 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
20:14:14.352 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:14.352 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:14.369 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:14.369 INFO DAGScheduler - Registering RDD 1003 (mapToPair at SparkUtils.java:161) as input to shuffle 42
20:14:14.369 INFO DAGScheduler - Got job 157 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:14.369 INFO DAGScheduler - Final stage: ResultStage 211 (runJob at SparkHadoopWriter.scala:83)
20:14:14.369 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 210)
20:14:14.370 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 210)
20:14:14.370 INFO DAGScheduler - Submitting ShuffleMapStage 210 (MapPartitionsRDD[1003] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:14.387 INFO MemoryStore - Block broadcast_420 stored as values in memory (estimated size 520.4 KiB, free 1917.6 MiB)
20:14:14.388 INFO MemoryStore - Block broadcast_420_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.4 MiB)
20:14:14.388 INFO BlockManagerInfo - Added broadcast_420_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.5 MiB)
20:14:14.388 INFO SparkContext - Created broadcast 420 from broadcast at DAGScheduler.scala:1580
20:14:14.389 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 210 (MapPartitionsRDD[1003] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:14.389 INFO TaskSchedulerImpl - Adding task set 210.0 with 1 tasks resource profile 0
20:14:14.389 INFO TaskSetManager - Starting task 0.0 in stage 210.0 (TID 266) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:14.389 INFO Executor - Running task 0.0 in stage 210.0 (TID 266)
20:14:14.419 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:14.433 INFO Executor - Finished task 0.0 in stage 210.0 (TID 266). 1148 bytes result sent to driver
20:14:14.433 INFO TaskSetManager - Finished task 0.0 in stage 210.0 (TID 266) in 44 ms on localhost (executor driver) (1/1)
20:14:14.434 INFO TaskSchedulerImpl - Removed TaskSet 210.0, whose tasks have all completed, from pool
20:14:14.434 INFO DAGScheduler - ShuffleMapStage 210 (mapToPair at SparkUtils.java:161) finished in 0.064 s
20:14:14.434 INFO DAGScheduler - looking for newly runnable stages
20:14:14.434 INFO DAGScheduler - running: HashSet()
20:14:14.434 INFO DAGScheduler - waiting: HashSet(ResultStage 211)
20:14:14.434 INFO DAGScheduler - failed: HashSet()
20:14:14.434 INFO DAGScheduler - Submitting ResultStage 211 (MapPartitionsRDD[1009] at saveAsTextFile at SamSink.java:65), which has no missing parents
20:14:14.441 INFO MemoryStore - Block broadcast_421 stored as values in memory (estimated size 241.1 KiB, free 1917.2 MiB)
20:14:14.441 INFO MemoryStore - Block broadcast_421_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1917.1 MiB)
20:14:14.441 INFO BlockManagerInfo - Added broadcast_421_piece0 in memory on localhost:35739 (size: 66.9 KiB, free: 1919.5 MiB)
20:14:14.442 INFO SparkContext - Created broadcast 421 from broadcast at DAGScheduler.scala:1580
20:14:14.442 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 211 (MapPartitionsRDD[1009] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
20:14:14.442 INFO TaskSchedulerImpl - Adding task set 211.0 with 1 tasks resource profile 0
20:14:14.442 INFO TaskSetManager - Starting task 0.0 in stage 211.0 (TID 267) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:14.442 INFO Executor - Running task 0.0 in stage 211.0 (TID 267)
20:14:14.447 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:14.448 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:14.458 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
20:14:14.458 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:14.458 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:14.475 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014148227585383234082461_1009_m_000000_0' to file:/tmp/ReadsSparkSinkUnitTest6.someOtherPlace10109887885747698106/_temporary/0/task_202502102014148227585383234082461_1009_m_000000
20:14:14.475 INFO SparkHadoopMapRedUtil - attempt_202502102014148227585383234082461_1009_m_000000_0: Committed. Elapsed time: 0 ms.
20:14:14.475 INFO Executor - Finished task 0.0 in stage 211.0 (TID 267). 1858 bytes result sent to driver
20:14:14.476 INFO TaskSetManager - Finished task 0.0 in stage 211.0 (TID 267) in 34 ms on localhost (executor driver) (1/1)
20:14:14.476 INFO TaskSchedulerImpl - Removed TaskSet 211.0, whose tasks have all completed, from pool
20:14:14.476 INFO DAGScheduler - ResultStage 211 (runJob at SparkHadoopWriter.scala:83) finished in 0.042 s
20:14:14.476 INFO DAGScheduler - Job 157 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:14.476 INFO TaskSchedulerImpl - Killing all running tasks in stage 211: Stage finished
20:14:14.476 INFO DAGScheduler - Job 157 finished: runJob at SparkHadoopWriter.scala:83, took 0.106962 s
20:14:14.476 INFO SparkHadoopWriter - Start to commit write Job job_202502102014148227585383234082461_1009.
20:14:14.480 INFO SparkHadoopWriter - Write Job job_202502102014148227585383234082461_1009 committed. Elapsed time: 4 ms.
20:14:14.488 INFO HadoopFileSystemWrapper - Concatenating 2 parts to /tmp/ReadsSparkSinkUnitTest616305205788985006533.sam
20:14:14.493 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest616305205788985006533.sam done
WARNING 2025-02-10 20:14:14 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-02-10 20:14:14 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:14.496 INFO MemoryStore - Block broadcast_422 stored as values in memory (estimated size 160.7 KiB, free 1917.0 MiB)
20:14:14.496 INFO MemoryStore - Block broadcast_422_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.0 MiB)
20:14:14.497 INFO BlockManagerInfo - Added broadcast_422_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:14:14.497 INFO SparkContext - Created broadcast 422 from broadcast at SamSource.java:78
20:14:14.497 INFO MemoryStore - Block broadcast_423 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
20:14:14.504 INFO MemoryStore - Block broadcast_423_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
20:14:14.504 INFO BlockManagerInfo - Added broadcast_423_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.4 MiB)
20:14:14.504 INFO SparkContext - Created broadcast 423 from newAPIHadoopFile at SamSource.java:108
20:14:14.506 INFO FileInputFormat - Total input files to process : 1
20:14:14.509 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:14.510 INFO DAGScheduler - Got job 158 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:14.510 INFO DAGScheduler - Final stage: ResultStage 212 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:14.510 INFO DAGScheduler - Parents of final stage: List()
20:14:14.510 INFO DAGScheduler - Missing parents: List()
20:14:14.510 INFO DAGScheduler - Submitting ResultStage 212 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:14.510 INFO MemoryStore - Block broadcast_424 stored as values in memory (estimated size 7.5 KiB, free 1916.6 MiB)
20:14:14.511 INFO MemoryStore - Block broadcast_424_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1916.6 MiB)
20:14:14.511 INFO BlockManagerInfo - Added broadcast_424_piece0 in memory on localhost:35739 (size: 3.8 KiB, free: 1919.4 MiB)
20:14:14.511 INFO SparkContext - Created broadcast 424 from broadcast at DAGScheduler.scala:1580
20:14:14.511 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 212 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:14.511 INFO TaskSchedulerImpl - Adding task set 212.0 with 1 tasks resource profile 0
20:14:14.511 INFO TaskSetManager - Starting task 0.0 in stage 212.0 (TID 268) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
20:14:14.512 INFO Executor - Running task 0.0 in stage 212.0 (TID 268)
20:14:14.513 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest616305205788985006533.sam:0+847558
20:14:14.524 INFO Executor - Finished task 0.0 in stage 212.0 (TID 268). 651483 bytes result sent to driver
20:14:14.526 INFO TaskSetManager - Finished task 0.0 in stage 212.0 (TID 268) in 15 ms on localhost (executor driver) (1/1)
20:14:14.526 INFO TaskSchedulerImpl - Removed TaskSet 212.0, whose tasks have all completed, from pool
20:14:14.526 INFO DAGScheduler - ResultStage 212 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.016 s
20:14:14.526 INFO DAGScheduler - Job 158 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:14.526 INFO TaskSchedulerImpl - Killing all running tasks in stage 212: Stage finished
20:14:14.526 INFO DAGScheduler - Job 158 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.016676 s
20:14:14.535 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:14.535 INFO DAGScheduler - Got job 159 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:14.535 INFO DAGScheduler - Final stage: ResultStage 213 (count at ReadsSparkSinkUnitTest.java:185)
20:14:14.535 INFO DAGScheduler - Parents of final stage: List()
20:14:14.535 INFO DAGScheduler - Missing parents: List()
20:14:14.535 INFO DAGScheduler - Submitting ResultStage 213 (MapPartitionsRDD[996] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:14.552 INFO MemoryStore - Block broadcast_425 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
20:14:14.553 INFO MemoryStore - Block broadcast_425_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.0 MiB)
20:14:14.553 INFO BlockManagerInfo - Added broadcast_425_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.3 MiB)
20:14:14.554 INFO SparkContext - Created broadcast 425 from broadcast at DAGScheduler.scala:1580
20:14:14.554 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 213 (MapPartitionsRDD[996] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:14.554 INFO TaskSchedulerImpl - Adding task set 213.0 with 1 tasks resource profile 0
20:14:14.554 INFO TaskSetManager - Starting task 0.0 in stage 213.0 (TID 269) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:14.555 INFO Executor - Running task 0.0 in stage 213.0 (TID 269)
20:14:14.588 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:14.597 INFO Executor - Finished task 0.0 in stage 213.0 (TID 269). 989 bytes result sent to driver
20:14:14.597 INFO TaskSetManager - Finished task 0.0 in stage 213.0 (TID 269) in 43 ms on localhost (executor driver) (1/1)
20:14:14.597 INFO TaskSchedulerImpl - Removed TaskSet 213.0, whose tasks have all completed, from pool
20:14:14.597 INFO DAGScheduler - ResultStage 213 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.061 s
20:14:14.597 INFO DAGScheduler - Job 159 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:14.597 INFO TaskSchedulerImpl - Killing all running tasks in stage 213: Stage finished
20:14:14.597 INFO DAGScheduler - Job 159 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.062270 s
20:14:14.601 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:14.601 INFO DAGScheduler - Got job 160 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:14.601 INFO DAGScheduler - Final stage: ResultStage 214 (count at ReadsSparkSinkUnitTest.java:185)
20:14:14.601 INFO DAGScheduler - Parents of final stage: List()
20:14:14.601 INFO DAGScheduler - Missing parents: List()
20:14:14.601 INFO DAGScheduler - Submitting ResultStage 214 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:14.602 INFO MemoryStore - Block broadcast_426 stored as values in memory (estimated size 7.4 KiB, free 1916.0 MiB)
20:14:14.602 INFO MemoryStore - Block broadcast_426_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1916.0 MiB)
20:14:14.602 INFO BlockManagerInfo - Added broadcast_426_piece0 in memory on localhost:35739 (size: 3.8 KiB, free: 1919.2 MiB)
20:14:14.602 INFO SparkContext - Created broadcast 426 from broadcast at DAGScheduler.scala:1580
20:14:14.602 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 214 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:14.602 INFO TaskSchedulerImpl - Adding task set 214.0 with 1 tasks resource profile 0
20:14:14.603 INFO TaskSetManager - Starting task 0.0 in stage 214.0 (TID 270) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
20:14:14.603 INFO Executor - Running task 0.0 in stage 214.0 (TID 270)
20:14:14.604 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest616305205788985006533.sam:0+847558
20:14:14.612 INFO Executor - Finished task 0.0 in stage 214.0 (TID 270). 946 bytes result sent to driver
20:14:14.612 INFO TaskSetManager - Finished task 0.0 in stage 214.0 (TID 270) in 9 ms on localhost (executor driver) (1/1)
20:14:14.612 INFO TaskSchedulerImpl - Removed TaskSet 214.0, whose tasks have all completed, from pool
20:14:14.612 INFO DAGScheduler - ResultStage 214 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.011 s
20:14:14.612 INFO DAGScheduler - Job 160 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:14.612 INFO TaskSchedulerImpl - Killing all running tasks in stage 214: Stage finished
20:14:14.612 INFO DAGScheduler - Job 160 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.011597 s
20:14:14.625 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:14.626 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:14.627 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:14.627 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:14.630 INFO MemoryStore - Block broadcast_427 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
20:14:14.636 INFO MemoryStore - Block broadcast_427_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.7 MiB)
20:14:14.636 INFO BlockManagerInfo - Added broadcast_427_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:14.636 INFO SparkContext - Created broadcast 427 from newAPIHadoopFile at PathSplitSource.java:96
20:14:14.673 INFO MemoryStore - Block broadcast_428 stored as values in memory (estimated size 297.9 KiB, free 1915.4 MiB)
20:14:14.680 INFO MemoryStore - Block broadcast_428_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
20:14:14.680 INFO BlockManagerInfo - Added broadcast_428_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:14.681 INFO SparkContext - Created broadcast 428 from newAPIHadoopFile at PathSplitSource.java:96
20:14:14.700 INFO FileInputFormat - Total input files to process : 1
20:14:14.702 INFO MemoryStore - Block broadcast_429 stored as values in memory (estimated size 160.7 KiB, free 1915.2 MiB)
20:14:14.707 INFO BlockManagerInfo - Removed broadcast_420_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.3 MiB)
20:14:14.707 INFO MemoryStore - Block broadcast_429_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.8 MiB)
20:14:14.707 INFO BlockManagerInfo - Added broadcast_429_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:14.707 INFO SparkContext - Created broadcast 429 from broadcast at ReadsSparkSink.java:133
20:14:14.707 INFO BlockManagerInfo - Removed broadcast_423_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:14.708 INFO BlockManagerInfo - Removed broadcast_419_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:14.708 INFO BlockManagerInfo - Removed broadcast_424_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1919.4 MiB)
20:14:14.709 INFO MemoryStore - Block broadcast_430 stored as values in memory (estimated size 163.2 KiB, free 1916.4 MiB)
20:14:14.709 INFO MemoryStore - Block broadcast_430_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.3 MiB)
20:14:14.709 INFO BlockManagerInfo - Removed broadcast_425_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.5 MiB)
20:14:14.710 INFO BlockManagerInfo - Added broadcast_430_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:14:14.710 INFO SparkContext - Created broadcast 430 from broadcast at BamSink.java:76
20:14:14.711 INFO BlockManagerInfo - Removed broadcast_404_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.5 MiB)
20:14:14.711 INFO BlockManagerInfo - Removed broadcast_428_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:14.712 INFO BlockManagerInfo - Removed broadcast_421_piece0 on localhost:35739 in memory (size: 66.9 KiB, free: 1919.6 MiB)
20:14:14.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts dst=null perm=null proto=rpc
20:14:14.712 INFO BlockManagerInfo - Removed broadcast_412_piece0 on localhost:35739 in memory (size: 160.0 B, free: 1919.6 MiB)
20:14:14.713 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:14.713 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:14.713 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:14.713 INFO BlockManagerInfo - Removed broadcast_417_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:14.713 INFO BlockManagerInfo - Removed broadcast_426_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1919.7 MiB)
20:14:14.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:14.714 INFO BlockManagerInfo - Removed broadcast_422_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:14:14.714 INFO BlockManagerInfo - Removed broadcast_416_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.8 MiB)
20:14:14.714 INFO BlockManagerInfo - Removed broadcast_413_piece0 on localhost:35739 in memory (size: 50.1 KiB, free: 1919.8 MiB)
20:14:14.715 INFO BlockManagerInfo - Removed broadcast_418_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:14.715 INFO BlockManagerInfo - Removed broadcast_405_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:14.720 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:14.721 INFO DAGScheduler - Registering RDD 1028 (mapToPair at SparkUtils.java:161) as input to shuffle 43
20:14:14.721 INFO DAGScheduler - Got job 161 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:14.721 INFO DAGScheduler - Final stage: ResultStage 216 (runJob at SparkHadoopWriter.scala:83)
20:14:14.721 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 215)
20:14:14.721 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 215)
20:14:14.721 INFO DAGScheduler - Submitting ShuffleMapStage 215 (MapPartitionsRDD[1028] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:14.739 INFO MemoryStore - Block broadcast_431 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
20:14:14.740 INFO MemoryStore - Block broadcast_431_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
20:14:14.740 INFO BlockManagerInfo - Added broadcast_431_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.8 MiB)
20:14:14.741 INFO SparkContext - Created broadcast 431 from broadcast at DAGScheduler.scala:1580
20:14:14.741 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 215 (MapPartitionsRDD[1028] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:14.741 INFO TaskSchedulerImpl - Adding task set 215.0 with 1 tasks resource profile 0
20:14:14.741 INFO TaskSetManager - Starting task 0.0 in stage 215.0 (TID 271) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:14.742 INFO Executor - Running task 0.0 in stage 215.0 (TID 271)
20:14:14.771 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:14.793 INFO Executor - Finished task 0.0 in stage 215.0 (TID 271). 1148 bytes result sent to driver
20:14:14.794 INFO TaskSetManager - Finished task 0.0 in stage 215.0 (TID 271) in 53 ms on localhost (executor driver) (1/1)
20:14:14.794 INFO TaskSchedulerImpl - Removed TaskSet 215.0, whose tasks have all completed, from pool
20:14:14.794 INFO DAGScheduler - ShuffleMapStage 215 (mapToPair at SparkUtils.java:161) finished in 0.073 s
20:14:14.794 INFO DAGScheduler - looking for newly runnable stages
20:14:14.794 INFO DAGScheduler - running: HashSet()
20:14:14.794 INFO DAGScheduler - waiting: HashSet(ResultStage 216)
20:14:14.794 INFO DAGScheduler - failed: HashSet()
20:14:14.795 INFO DAGScheduler - Submitting ResultStage 216 (MapPartitionsRDD[1033] at mapToPair at BamSink.java:91), which has no missing parents
20:14:14.802 INFO MemoryStore - Block broadcast_432 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
20:14:14.802 INFO MemoryStore - Block broadcast_432_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
20:14:14.803 INFO BlockManagerInfo - Added broadcast_432_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.7 MiB)
20:14:14.803 INFO SparkContext - Created broadcast 432 from broadcast at DAGScheduler.scala:1580
20:14:14.803 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 216 (MapPartitionsRDD[1033] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:14.803 INFO TaskSchedulerImpl - Adding task set 216.0 with 1 tasks resource profile 0
20:14:14.804 INFO TaskSetManager - Starting task 0.0 in stage 216.0 (TID 272) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:14.804 INFO Executor - Running task 0.0 in stage 216.0 (TID 272)
20:14:14.808 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:14.808 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:14.819 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:14.819 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:14.819 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:14.819 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:14.819 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:14.819 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:14.821 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:14.822 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:14.822 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:14.824 INFO StateChange - BLOCK* allocate blk_1073741871_1047, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/part-r-00000
20:14:14.825 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741871_1047 src: /127.0.0.1:49972 dest: /127.0.0.1:38353
20:14:14.828 INFO clienttrace - src: /127.0.0.1:49972, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741871_1047, duration(ns): 1522120
20:14:14.828 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741871_1047, type=LAST_IN_PIPELINE terminating
20:14:14.829 INFO FSNamesystem - BLOCK* blk_1073741871_1047 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/part-r-00000
20:14:15.229 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:15.230 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:14:15.231 INFO StateChange - BLOCK* allocate blk_1073741872_1048, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/.part-r-00000.sbi
20:14:15.232 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741872_1048 src: /127.0.0.1:49980 dest: /127.0.0.1:38353
20:14:15.233 INFO clienttrace - src: /127.0.0.1:49980, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741872_1048, duration(ns): 459165
20:14:15.233 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741872_1048, type=LAST_IN_PIPELINE terminating
20:14:15.233 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:15.235 INFO StateChange - BLOCK* allocate blk_1073741873_1049, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/.part-r-00000.bai
20:14:15.236 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741873_1049 src: /127.0.0.1:49994 dest: /127.0.0.1:38353
20:14:15.237 INFO clienttrace - src: /127.0.0.1:49994, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741873_1049, duration(ns): 359563
20:14:15.237 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741873_1049, type=LAST_IN_PIPELINE terminating
20:14:15.237 INFO FSNamesystem - BLOCK* blk_1073741873_1049 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/.part-r-00000.bai
20:14:15.638 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:15.638 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0 dst=null perm=null proto=rpc
20:14:15.639 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0 dst=null perm=null proto=rpc
20:14:15.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/task_202502102014143103266549059131655_1033_r_000000 dst=null perm=null proto=rpc
20:14:15.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/_temporary/attempt_202502102014143103266549059131655_1033_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/task_202502102014143103266549059131655_1033_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:15.641 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014143103266549059131655_1033_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/task_202502102014143103266549059131655_1033_r_000000
20:14:15.641 INFO SparkHadoopMapRedUtil - attempt_202502102014143103266549059131655_1033_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:15.641 INFO Executor - Finished task 0.0 in stage 216.0 (TID 272). 1858 bytes result sent to driver
20:14:15.641 INFO TaskSetManager - Finished task 0.0 in stage 216.0 (TID 272) in 838 ms on localhost (executor driver) (1/1)
20:14:15.641 INFO TaskSchedulerImpl - Removed TaskSet 216.0, whose tasks have all completed, from pool
20:14:15.642 INFO DAGScheduler - ResultStage 216 (runJob at SparkHadoopWriter.scala:83) finished in 0.847 s
20:14:15.642 INFO DAGScheduler - Job 161 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:15.642 INFO TaskSchedulerImpl - Killing all running tasks in stage 216: Stage finished
20:14:15.642 INFO DAGScheduler - Job 161 finished: runJob at SparkHadoopWriter.scala:83, took 0.921343 s
20:14:15.642 INFO SparkHadoopWriter - Start to commit write Job job_202502102014143103266549059131655_1033.
20:14:15.643 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:15.643 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts dst=null perm=null proto=rpc
20:14:15.644 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/task_202502102014143103266549059131655_1033_r_000000 dst=null perm=null proto=rpc
20:14:15.644 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:15.645 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/task_202502102014143103266549059131655_1033_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:15.645 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:15.646 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/task_202502102014143103266549059131655_1033_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:15.646 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:15.647 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary/0/task_202502102014143103266549059131655_1033_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:15.648 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_temporary dst=null perm=null proto=rpc
20:14:15.648 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:15.649 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:15.650 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.spark-staging-1033 dst=null perm=null proto=rpc
20:14:15.650 INFO SparkHadoopWriter - Write Job job_202502102014143103266549059131655_1033 committed. Elapsed time: 7 ms.
20:14:15.650 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:15.652 INFO StateChange - BLOCK* allocate blk_1073741874_1050, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/header
20:14:15.653 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741874_1050 src: /127.0.0.1:50010 dest: /127.0.0.1:38353
20:14:15.654 INFO clienttrace - src: /127.0.0.1:50010, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741874_1050, duration(ns): 496370
20:14:15.654 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741874_1050, type=LAST_IN_PIPELINE terminating
20:14:15.655 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:15.656 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:15.657 INFO StateChange - BLOCK* allocate blk_1073741875_1051, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/terminator
20:14:15.658 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741875_1051 src: /127.0.0.1:50020 dest: /127.0.0.1:38353
20:14:15.659 INFO clienttrace - src: /127.0.0.1:50020, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741875_1051, duration(ns): 350953
20:14:15.659 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741875_1051, type=LAST_IN_PIPELINE terminating
20:14:15.659 INFO FSNamesystem - BLOCK* blk_1073741875_1051 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/terminator
20:14:16.060 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:16.061 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts dst=null perm=null proto=rpc
20:14:16.062 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.063 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:16.063 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam
20:14:16.064 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.064 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.065 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.065 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.065 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam done
20:14:16.066 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.066 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi
20:14:16.066 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts dst=null perm=null proto=rpc
20:14:16.067 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.068 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:16.068 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:16.070 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:16.070 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:16.071 INFO StateChange - BLOCK* allocate blk_1073741876_1052, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi
20:14:16.071 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741876_1052 src: /127.0.0.1:50032 dest: /127.0.0.1:38353
20:14:16.073 INFO clienttrace - src: /127.0.0.1:50032, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741876_1052, duration(ns): 491453
20:14:16.073 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741876_1052, type=LAST_IN_PIPELINE terminating
20:14:16.073 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:16.074 INFO IndexFileMerger - Done merging .sbi files
20:14:16.074 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai
20:14:16.074 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts dst=null perm=null proto=rpc
20:14:16.075 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.075 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:16.076 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:16.077 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:16.077 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:16.078 INFO StateChange - BLOCK* allocate blk_1073741877_1053, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai
20:14:16.079 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741877_1053 src: /127.0.0.1:50042 dest: /127.0.0.1:38353
20:14:16.080 INFO clienttrace - src: /127.0.0.1:50042, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741877_1053, duration(ns): 507892
20:14:16.080 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741877_1053, type=LAST_IN_PIPELINE terminating
20:14:16.081 INFO FSNamesystem - BLOCK* blk_1073741877_1053 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai
20:14:16.481 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:16.482 INFO IndexFileMerger - Done merging .bai files
20:14:16.482 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.parts dst=null perm=null proto=rpc
20:14:16.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.498 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi dst=null perm=null proto=rpc
20:14:16.498 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi dst=null perm=null proto=rpc
20:14:16.499 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi dst=null perm=null proto=rpc
20:14:16.500 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:16.500 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.501 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.501 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.501 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.504 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:16.507 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:16.507 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:16.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi dst=null perm=null proto=rpc
20:14:16.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi dst=null perm=null proto=rpc
20:14:16.508 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.sbi dst=null perm=null proto=rpc
20:14:16.509 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:16.509 INFO MemoryStore - Block broadcast_433 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
20:14:16.509 INFO MemoryStore - Block broadcast_433_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
20:14:16.509 INFO BlockManagerInfo - Added broadcast_433_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.7 MiB)
20:14:16.510 INFO SparkContext - Created broadcast 433 from broadcast at BamSource.java:104
20:14:16.510 INFO MemoryStore - Block broadcast_434 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:16.516 INFO MemoryStore - Block broadcast_434_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:16.517 INFO BlockManagerInfo - Added broadcast_434_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:16.517 INFO SparkContext - Created broadcast 434 from newAPIHadoopFile at PathSplitSource.java:96
20:14:16.525 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.526 INFO FileInputFormat - Total input files to process : 1
20:14:16.526 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.540 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:16.541 INFO DAGScheduler - Got job 162 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:16.541 INFO DAGScheduler - Final stage: ResultStage 217 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:16.541 INFO DAGScheduler - Parents of final stage: List()
20:14:16.541 INFO DAGScheduler - Missing parents: List()
20:14:16.541 INFO DAGScheduler - Submitting ResultStage 217 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:16.551 INFO MemoryStore - Block broadcast_435 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
20:14:16.552 INFO MemoryStore - Block broadcast_435_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
20:14:16.552 INFO BlockManagerInfo - Added broadcast_435_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:14:16.552 INFO SparkContext - Created broadcast 435 from broadcast at DAGScheduler.scala:1580
20:14:16.552 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 217 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:16.552 INFO TaskSchedulerImpl - Adding task set 217.0 with 1 tasks resource profile 0
20:14:16.553 INFO TaskSetManager - Starting task 0.0 in stage 217.0 (TID 273) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:16.553 INFO Executor - Running task 0.0 in stage 217.0 (TID 273)
20:14:16.569 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam:0+237038
20:14:16.570 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.571 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.572 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.572 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.572 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.574 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:16.576 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:16.576 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:16.578 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:16.578 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:16.581 INFO Executor - Finished task 0.0 in stage 217.0 (TID 273). 651526 bytes result sent to driver
20:14:16.583 INFO TaskSetManager - Finished task 0.0 in stage 217.0 (TID 273) in 31 ms on localhost (executor driver) (1/1)
20:14:16.583 INFO TaskSchedulerImpl - Removed TaskSet 217.0, whose tasks have all completed, from pool
20:14:16.583 INFO DAGScheduler - ResultStage 217 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.042 s
20:14:16.583 INFO DAGScheduler - Job 162 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:16.584 INFO TaskSchedulerImpl - Killing all running tasks in stage 217: Stage finished
20:14:16.584 INFO DAGScheduler - Job 162 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.043192 s
20:14:16.593 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:16.593 INFO DAGScheduler - Got job 163 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:16.593 INFO DAGScheduler - Final stage: ResultStage 218 (count at ReadsSparkSinkUnitTest.java:185)
20:14:16.593 INFO DAGScheduler - Parents of final stage: List()
20:14:16.593 INFO DAGScheduler - Missing parents: List()
20:14:16.593 INFO DAGScheduler - Submitting ResultStage 218 (MapPartitionsRDD[1021] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:16.615 INFO MemoryStore - Block broadcast_436 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
20:14:16.617 INFO MemoryStore - Block broadcast_436_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
20:14:16.617 INFO BlockManagerInfo - Added broadcast_436_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:16.617 INFO SparkContext - Created broadcast 436 from broadcast at DAGScheduler.scala:1580
20:14:16.617 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 218 (MapPartitionsRDD[1021] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:16.617 INFO TaskSchedulerImpl - Adding task set 218.0 with 1 tasks resource profile 0
20:14:16.618 INFO TaskSetManager - Starting task 0.0 in stage 218.0 (TID 274) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:16.618 INFO Executor - Running task 0.0 in stage 218.0 (TID 274)
20:14:16.647 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:16.656 INFO Executor - Finished task 0.0 in stage 218.0 (TID 274). 989 bytes result sent to driver
20:14:16.657 INFO TaskSetManager - Finished task 0.0 in stage 218.0 (TID 274) in 39 ms on localhost (executor driver) (1/1)
20:14:16.657 INFO TaskSchedulerImpl - Removed TaskSet 218.0, whose tasks have all completed, from pool
20:14:16.657 INFO DAGScheduler - ResultStage 218 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
20:14:16.657 INFO DAGScheduler - Job 163 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:16.657 INFO TaskSchedulerImpl - Killing all running tasks in stage 218: Stage finished
20:14:16.657 INFO DAGScheduler - Job 163 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.064129 s
20:14:16.662 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:16.662 INFO DAGScheduler - Got job 164 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:16.662 INFO DAGScheduler - Final stage: ResultStage 219 (count at ReadsSparkSinkUnitTest.java:185)
20:14:16.662 INFO DAGScheduler - Parents of final stage: List()
20:14:16.662 INFO DAGScheduler - Missing parents: List()
20:14:16.662 INFO DAGScheduler - Submitting ResultStage 219 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:16.668 INFO MemoryStore - Block broadcast_437 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
20:14:16.669 INFO MemoryStore - Block broadcast_437_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
20:14:16.669 INFO BlockManagerInfo - Added broadcast_437_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.4 MiB)
20:14:16.669 INFO SparkContext - Created broadcast 437 from broadcast at DAGScheduler.scala:1580
20:14:16.670 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 219 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:16.670 INFO TaskSchedulerImpl - Adding task set 219.0 with 1 tasks resource profile 0
20:14:16.670 INFO TaskSetManager - Starting task 0.0 in stage 219.0 (TID 275) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:16.670 INFO Executor - Running task 0.0 in stage 219.0 (TID 275)
20:14:16.682 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam:0+237038
20:14:16.682 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.683 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam dst=null perm=null proto=rpc
20:14:16.684 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.684 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.685 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_8bf25915-8218-4a40-bddb-91af0e21848e.bam.bai dst=null perm=null proto=rpc
20:14:16.687 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:16.688 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:16.689 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:16.689 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:16.691 INFO Executor - Finished task 0.0 in stage 219.0 (TID 275). 989 bytes result sent to driver
20:14:16.691 INFO TaskSetManager - Finished task 0.0 in stage 219.0 (TID 275) in 21 ms on localhost (executor driver) (1/1)
20:14:16.691 INFO TaskSchedulerImpl - Removed TaskSet 219.0, whose tasks have all completed, from pool
20:14:16.691 INFO DAGScheduler - ResultStage 219 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.029 s
20:14:16.691 INFO DAGScheduler - Job 164 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:16.691 INFO TaskSchedulerImpl - Killing all running tasks in stage 219: Stage finished
20:14:16.691 INFO DAGScheduler - Job 164 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.029553 s
20:14:16.700 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:16.701 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.702 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:16.702 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:16.705 INFO MemoryStore - Block broadcast_438 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
20:14:16.711 INFO MemoryStore - Block broadcast_438_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
20:14:16.711 INFO BlockManagerInfo - Added broadcast_438_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:16.711 INFO SparkContext - Created broadcast 438 from newAPIHadoopFile at PathSplitSource.java:96
20:14:16.733 INFO MemoryStore - Block broadcast_439 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
20:14:16.739 INFO MemoryStore - Block broadcast_439_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:14:16.739 INFO BlockManagerInfo - Added broadcast_439_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:16.739 INFO SparkContext - Created broadcast 439 from newAPIHadoopFile at PathSplitSource.java:96
20:14:16.759 INFO FileInputFormat - Total input files to process : 1
20:14:16.761 INFO MemoryStore - Block broadcast_440 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
20:14:16.761 INFO MemoryStore - Block broadcast_440_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
20:14:16.762 INFO BlockManagerInfo - Added broadcast_440_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:16.762 INFO SparkContext - Created broadcast 440 from broadcast at ReadsSparkSink.java:133
20:14:16.763 INFO MemoryStore - Block broadcast_441 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
20:14:16.764 INFO MemoryStore - Block broadcast_441_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
20:14:16.764 INFO BlockManagerInfo - Added broadcast_441_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:16.764 INFO SparkContext - Created broadcast 441 from broadcast at BamSink.java:76
20:14:16.766 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts dst=null perm=null proto=rpc
20:14:16.766 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:16.766 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:16.766 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:16.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:16.773 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:16.773 INFO DAGScheduler - Registering RDD 1053 (mapToPair at SparkUtils.java:161) as input to shuffle 44
20:14:16.773 INFO DAGScheduler - Got job 165 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:16.773 INFO DAGScheduler - Final stage: ResultStage 221 (runJob at SparkHadoopWriter.scala:83)
20:14:16.773 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 220)
20:14:16.773 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 220)
20:14:16.774 INFO DAGScheduler - Submitting ShuffleMapStage 220 (MapPartitionsRDD[1053] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:16.791 INFO MemoryStore - Block broadcast_442 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
20:14:16.795 INFO BlockManagerInfo - Removed broadcast_439_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:16.795 INFO BlockManagerInfo - Removed broadcast_427_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:16.796 INFO BlockManagerInfo - Removed broadcast_433_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.4 MiB)
20:14:16.796 INFO MemoryStore - Block broadcast_442_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.0 MiB)
20:14:16.796 INFO BlockManagerInfo - Added broadcast_442_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.2 MiB)
20:14:16.796 INFO SparkContext - Created broadcast 442 from broadcast at DAGScheduler.scala:1580
20:14:16.796 INFO BlockManagerInfo - Removed broadcast_431_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.4 MiB)
20:14:16.796 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 220 (MapPartitionsRDD[1053] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:16.796 INFO TaskSchedulerImpl - Adding task set 220.0 with 1 tasks resource profile 0
20:14:16.797 INFO BlockManagerInfo - Removed broadcast_435_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.4 MiB)
20:14:16.797 INFO TaskSetManager - Starting task 0.0 in stage 220.0 (TID 276) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:16.797 INFO Executor - Running task 0.0 in stage 220.0 (TID 276)
20:14:16.798 INFO BlockManagerInfo - Removed broadcast_429_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:16.798 INFO BlockManagerInfo - Removed broadcast_430_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:16.798 INFO BlockManagerInfo - Removed broadcast_436_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.6 MiB)
20:14:16.799 INFO BlockManagerInfo - Removed broadcast_437_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.7 MiB)
20:14:16.801 INFO BlockManagerInfo - Removed broadcast_434_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:16.802 INFO BlockManagerInfo - Removed broadcast_432_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.8 MiB)
20:14:16.832 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:16.848 INFO Executor - Finished task 0.0 in stage 220.0 (TID 276). 1148 bytes result sent to driver
20:14:16.848 INFO TaskSetManager - Finished task 0.0 in stage 220.0 (TID 276) in 51 ms on localhost (executor driver) (1/1)
20:14:16.848 INFO TaskSchedulerImpl - Removed TaskSet 220.0, whose tasks have all completed, from pool
20:14:16.849 INFO DAGScheduler - ShuffleMapStage 220 (mapToPair at SparkUtils.java:161) finished in 0.075 s
20:14:16.849 INFO DAGScheduler - looking for newly runnable stages
20:14:16.849 INFO DAGScheduler - running: HashSet()
20:14:16.849 INFO DAGScheduler - waiting: HashSet(ResultStage 221)
20:14:16.849 INFO DAGScheduler - failed: HashSet()
20:14:16.849 INFO DAGScheduler - Submitting ResultStage 221 (MapPartitionsRDD[1058] at mapToPair at BamSink.java:91), which has no missing parents
20:14:16.857 INFO MemoryStore - Block broadcast_443 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
20:14:16.858 INFO MemoryStore - Block broadcast_443_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
20:14:16.858 INFO BlockManagerInfo - Added broadcast_443_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.7 MiB)
20:14:16.858 INFO SparkContext - Created broadcast 443 from broadcast at DAGScheduler.scala:1580
20:14:16.859 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 221 (MapPartitionsRDD[1058] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:16.859 INFO TaskSchedulerImpl - Adding task set 221.0 with 1 tasks resource profile 0
20:14:16.859 INFO TaskSetManager - Starting task 0.0 in stage 221.0 (TID 277) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:16.859 INFO Executor - Running task 0.0 in stage 221.0 (TID 277)
20:14:16.864 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:16.864 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:16.875 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:16.875 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:16.875 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:16.875 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:16.875 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:16.875 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:16.876 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.877 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.878 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:16.881 INFO StateChange - BLOCK* allocate blk_1073741878_1054, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/part-r-00000
20:14:16.882 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741878_1054 src: /127.0.0.1:50072 dest: /127.0.0.1:38353
20:14:16.884 INFO clienttrace - src: /127.0.0.1:50072, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741878_1054, duration(ns): 1046166
20:14:16.884 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741878_1054, type=LAST_IN_PIPELINE terminating
20:14:16.884 INFO FSNamesystem - BLOCK* blk_1073741878_1054 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/part-r-00000
20:14:17.285 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:17.286 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:14:17.287 INFO StateChange - BLOCK* allocate blk_1073741879_1055, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/.part-r-00000.sbi
20:14:17.287 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741879_1055 src: /127.0.0.1:50074 dest: /127.0.0.1:38353
20:14:17.288 INFO clienttrace - src: /127.0.0.1:50074, dest: /127.0.0.1:38353, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741879_1055, duration(ns): 448580
20:14:17.288 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741879_1055, type=LAST_IN_PIPELINE terminating
20:14:17.289 INFO FSNamesystem - BLOCK* blk_1073741879_1055 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/.part-r-00000.sbi
20:14:17.690 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:17.692 INFO StateChange - BLOCK* allocate blk_1073741880_1056, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/.part-r-00000.bai
20:14:17.693 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741880_1056 src: /127.0.0.1:34998 dest: /127.0.0.1:38353
20:14:17.695 INFO clienttrace - src: /127.0.0.1:34998, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741880_1056, duration(ns): 586358
20:14:17.695 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741880_1056, type=LAST_IN_PIPELINE terminating
20:14:17.695 INFO FSNamesystem - BLOCK* blk_1073741880_1056 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/.part-r-00000.bai
20:14:18.096 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:18.097 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0 dst=null perm=null proto=rpc
20:14:18.098 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0 dst=null perm=null proto=rpc
20:14:18.099 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/task_202502102014161013835853950711640_1058_r_000000 dst=null perm=null proto=rpc
20:14:18.099 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/_temporary/attempt_202502102014161013835853950711640_1058_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/task_202502102014161013835853950711640_1058_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:18.099 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014161013835853950711640_1058_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/task_202502102014161013835853950711640_1058_r_000000
20:14:18.099 INFO SparkHadoopMapRedUtil - attempt_202502102014161013835853950711640_1058_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:18.100 INFO Executor - Finished task 0.0 in stage 221.0 (TID 277). 1858 bytes result sent to driver
20:14:18.100 INFO TaskSetManager - Finished task 0.0 in stage 221.0 (TID 277) in 1241 ms on localhost (executor driver) (1/1)
20:14:18.101 INFO TaskSchedulerImpl - Removed TaskSet 221.0, whose tasks have all completed, from pool
20:14:18.101 INFO DAGScheduler - ResultStage 221 (runJob at SparkHadoopWriter.scala:83) finished in 1.252 s
20:14:18.101 INFO DAGScheduler - Job 165 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:18.101 INFO TaskSchedulerImpl - Killing all running tasks in stage 221: Stage finished
20:14:18.101 INFO DAGScheduler - Job 165 finished: runJob at SparkHadoopWriter.scala:83, took 1.328017 s
20:14:18.101 INFO SparkHadoopWriter - Start to commit write Job job_202502102014161013835853950711640_1058.
20:14:18.102 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:18.102 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts dst=null perm=null proto=rpc
20:14:18.102 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/task_202502102014161013835853950711640_1058_r_000000 dst=null perm=null proto=rpc
20:14:18.103 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:18.103 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/task_202502102014161013835853950711640_1058_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.104 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:18.104 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/task_202502102014161013835853950711640_1058_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.105 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:18.105 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary/0/task_202502102014161013835853950711640_1058_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.106 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_temporary dst=null perm=null proto=rpc
20:14:18.106 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.107 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:18.108 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.spark-staging-1058 dst=null perm=null proto=rpc
20:14:18.108 INFO SparkHadoopWriter - Write Job job_202502102014161013835853950711640_1058 committed. Elapsed time: 6 ms.
20:14:18.108 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.110 INFO StateChange - BLOCK* allocate blk_1073741881_1057, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/header
20:14:18.111 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741881_1057 src: /127.0.0.1:35002 dest: /127.0.0.1:38353
20:14:18.112 INFO clienttrace - src: /127.0.0.1:35002, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741881_1057, duration(ns): 460538
20:14:18.112 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741881_1057, type=LAST_IN_PIPELINE terminating
20:14:18.112 INFO FSNamesystem - BLOCK* blk_1073741881_1057 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/header
20:14:18.513 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:18.514 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.515 INFO StateChange - BLOCK* allocate blk_1073741882_1058, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/terminator
20:14:18.516 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741882_1058 src: /127.0.0.1:35008 dest: /127.0.0.1:38353
20:14:18.517 INFO clienttrace - src: /127.0.0.1:35008, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741882_1058, duration(ns): 447762
20:14:18.517 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741882_1058, type=LAST_IN_PIPELINE terminating
20:14:18.518 INFO FSNamesystem - BLOCK* blk_1073741882_1058 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/terminator
20:14:18.919 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:18.920 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts dst=null perm=null proto=rpc
20:14:18.921 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.921 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:18.922 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam
20:14:18.922 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.922 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:18.923 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:18.923 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.924 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam done
20:14:18.924 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:18.924 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi
20:14:18.924 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts dst=null perm=null proto=rpc
20:14:18.925 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:18.926 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:18.927 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:18.928 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
20:14:18.929 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:18.930 INFO StateChange - BLOCK* allocate blk_1073741883_1059, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi
20:14:18.931 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741883_1059 src: /127.0.0.1:35010 dest: /127.0.0.1:38353
20:14:18.932 INFO clienttrace - src: /127.0.0.1:35010, dest: /127.0.0.1:38353, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741883_1059, duration(ns): 550532
20:14:18.932 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741883_1059, type=LAST_IN_PIPELINE terminating
20:14:18.933 INFO FSNamesystem - BLOCK* blk_1073741883_1059 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi
20:14:19.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741872_1048 replica FinalizedReplica, blk_1073741872_1048, FINALIZED
getNumBytes() = 212
getBytesOnDisk() = 212
getVisibleLength()= 212
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data2
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741872 for deletion
20:14:19.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741873_1049 replica FinalizedReplica, blk_1073741873_1049, FINALIZED
getNumBytes() = 5472
getBytesOnDisk() = 5472
getVisibleLength()= 5472
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data1
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741873 for deletion
20:14:19.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741872_1048 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741872
20:14:19.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741873_1049 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741873
20:14:19.333 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:19.334 INFO IndexFileMerger - Done merging .sbi files
20:14:19.334 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai
20:14:19.334 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts dst=null perm=null proto=rpc
20:14:19.335 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:19.336 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:19.336 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:19.337 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:19.337 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:19.338 INFO StateChange - BLOCK* allocate blk_1073741884_1060, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai
20:14:19.339 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741884_1060 src: /127.0.0.1:35026 dest: /127.0.0.1:38353
20:14:19.340 INFO clienttrace - src: /127.0.0.1:35026, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741884_1060, duration(ns): 371065
20:14:19.340 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741884_1060, type=LAST_IN_PIPELINE terminating
20:14:19.341 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:19.341 INFO IndexFileMerger - Done merging .bai files
20:14:19.341 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.parts dst=null perm=null proto=rpc
20:14:19.350 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.357 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi dst=null perm=null proto=rpc
20:14:19.357 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi dst=null perm=null proto=rpc
20:14:19.358 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi dst=null perm=null proto=rpc
20:14:19.358 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
20:14:19.359 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.359 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.359 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.360 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.360 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.361 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.361 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.362 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:19.364 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:19.364 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:19.364 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:19.365 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi dst=null perm=null proto=rpc
20:14:19.365 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi dst=null perm=null proto=rpc
20:14:19.365 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.sbi dst=null perm=null proto=rpc
20:14:19.366 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
20:14:19.367 INFO MemoryStore - Block broadcast_444 stored as values in memory (estimated size 13.3 KiB, free 1918.3 MiB)
20:14:19.367 INFO MemoryStore - Block broadcast_444_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.3 MiB)
20:14:19.367 INFO BlockManagerInfo - Added broadcast_444_piece0 in memory on localhost:35739 (size: 8.3 KiB, free: 1919.7 MiB)
20:14:19.368 INFO SparkContext - Created broadcast 444 from broadcast at BamSource.java:104
20:14:19.369 INFO MemoryStore - Block broadcast_445 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
20:14:19.375 INFO MemoryStore - Block broadcast_445_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:19.375 INFO BlockManagerInfo - Added broadcast_445_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:19.375 INFO SparkContext - Created broadcast 445 from newAPIHadoopFile at PathSplitSource.java:96
20:14:19.384 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.384 INFO FileInputFormat - Total input files to process : 1
20:14:19.384 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.398 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:19.399 INFO DAGScheduler - Got job 166 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:19.399 INFO DAGScheduler - Final stage: ResultStage 222 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:19.399 INFO DAGScheduler - Parents of final stage: List()
20:14:19.399 INFO DAGScheduler - Missing parents: List()
20:14:19.399 INFO DAGScheduler - Submitting ResultStage 222 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:19.405 INFO MemoryStore - Block broadcast_446 stored as values in memory (estimated size 148.2 KiB, free 1917.8 MiB)
20:14:19.405 INFO MemoryStore - Block broadcast_446_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
20:14:19.405 INFO BlockManagerInfo - Added broadcast_446_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:14:19.406 INFO SparkContext - Created broadcast 446 from broadcast at DAGScheduler.scala:1580
20:14:19.406 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 222 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:19.406 INFO TaskSchedulerImpl - Adding task set 222.0 with 1 tasks resource profile 0
20:14:19.406 INFO TaskSetManager - Starting task 0.0 in stage 222.0 (TID 278) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:19.406 INFO Executor - Running task 0.0 in stage 222.0 (TID 278)
20:14:19.417 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam:0+237038
20:14:19.418 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.418 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.419 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.420 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.420 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.421 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:19.423 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:19.424 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:19.425 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:19.427 INFO Executor - Finished task 0.0 in stage 222.0 (TID 278). 651483 bytes result sent to driver
20:14:19.429 INFO TaskSetManager - Finished task 0.0 in stage 222.0 (TID 278) in 23 ms on localhost (executor driver) (1/1)
20:14:19.429 INFO TaskSchedulerImpl - Removed TaskSet 222.0, whose tasks have all completed, from pool
20:14:19.429 INFO DAGScheduler - ResultStage 222 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.030 s
20:14:19.429 INFO DAGScheduler - Job 166 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:19.429 INFO TaskSchedulerImpl - Killing all running tasks in stage 222: Stage finished
20:14:19.429 INFO DAGScheduler - Job 166 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.030500 s
20:14:19.444 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:19.444 INFO DAGScheduler - Got job 167 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:19.444 INFO DAGScheduler - Final stage: ResultStage 223 (count at ReadsSparkSinkUnitTest.java:185)
20:14:19.444 INFO DAGScheduler - Parents of final stage: List()
20:14:19.444 INFO DAGScheduler - Missing parents: List()
20:14:19.444 INFO DAGScheduler - Submitting ResultStage 223 (MapPartitionsRDD[1046] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:19.461 INFO MemoryStore - Block broadcast_447 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
20:14:19.462 INFO MemoryStore - Block broadcast_447_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
20:14:19.462 INFO BlockManagerInfo - Added broadcast_447_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.4 MiB)
20:14:19.462 INFO SparkContext - Created broadcast 447 from broadcast at DAGScheduler.scala:1580
20:14:19.462 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 223 (MapPartitionsRDD[1046] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:19.462 INFO TaskSchedulerImpl - Adding task set 223.0 with 1 tasks resource profile 0
20:14:19.463 INFO TaskSetManager - Starting task 0.0 in stage 223.0 (TID 279) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:19.463 INFO Executor - Running task 0.0 in stage 223.0 (TID 279)
20:14:19.492 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:19.501 INFO Executor - Finished task 0.0 in stage 223.0 (TID 279). 989 bytes result sent to driver
20:14:19.501 INFO TaskSetManager - Finished task 0.0 in stage 223.0 (TID 279) in 38 ms on localhost (executor driver) (1/1)
20:14:19.501 INFO TaskSchedulerImpl - Removed TaskSet 223.0, whose tasks have all completed, from pool
20:14:19.502 INFO DAGScheduler - ResultStage 223 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
20:14:19.502 INFO DAGScheduler - Job 167 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:19.502 INFO TaskSchedulerImpl - Killing all running tasks in stage 223: Stage finished
20:14:19.502 INFO DAGScheduler - Job 167 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058028 s
20:14:19.506 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:19.506 INFO DAGScheduler - Got job 168 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:19.506 INFO DAGScheduler - Final stage: ResultStage 224 (count at ReadsSparkSinkUnitTest.java:185)
20:14:19.506 INFO DAGScheduler - Parents of final stage: List()
20:14:19.506 INFO DAGScheduler - Missing parents: List()
20:14:19.507 INFO DAGScheduler - Submitting ResultStage 224 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:19.513 INFO MemoryStore - Block broadcast_448 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
20:14:19.513 INFO MemoryStore - Block broadcast_448_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.0 MiB)
20:14:19.513 INFO BlockManagerInfo - Added broadcast_448_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.4 MiB)
20:14:19.514 INFO SparkContext - Created broadcast 448 from broadcast at DAGScheduler.scala:1580
20:14:19.514 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 224 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:19.514 INFO TaskSchedulerImpl - Adding task set 224.0 with 1 tasks resource profile 0
20:14:19.514 INFO TaskSetManager - Starting task 0.0 in stage 224.0 (TID 280) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:19.514 INFO Executor - Running task 0.0 in stage 224.0 (TID 280)
20:14:19.525 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam:0+237038
20:14:19.526 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.526 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam dst=null perm=null proto=rpc
20:14:19.527 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.528 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.528 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_9be914ec-1b4e-4a21-81c1-4484ed244bad.bam.bai dst=null perm=null proto=rpc
20:14:19.529 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:19.531 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:19.531 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:19.532 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:19.532 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:19.534 INFO Executor - Finished task 0.0 in stage 224.0 (TID 280). 989 bytes result sent to driver
20:14:19.534 INFO TaskSetManager - Finished task 0.0 in stage 224.0 (TID 280) in 20 ms on localhost (executor driver) (1/1)
20:14:19.534 INFO TaskSchedulerImpl - Removed TaskSet 224.0, whose tasks have all completed, from pool
20:14:19.534 INFO DAGScheduler - ResultStage 224 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.027 s
20:14:19.534 INFO DAGScheduler - Job 168 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:19.534 INFO TaskSchedulerImpl - Killing all running tasks in stage 224: Stage finished
20:14:19.534 INFO DAGScheduler - Job 168 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.028234 s
20:14:19.542 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:19.543 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:19.544 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:19.544 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:19.546 INFO MemoryStore - Block broadcast_449 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
20:14:19.552 INFO MemoryStore - Block broadcast_449_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
20:14:19.552 INFO BlockManagerInfo - Added broadcast_449_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:19.553 INFO SparkContext - Created broadcast 449 from newAPIHadoopFile at PathSplitSource.java:96
20:14:19.573 INFO MemoryStore - Block broadcast_450 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
20:14:19.579 INFO MemoryStore - Block broadcast_450_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:14:19.579 INFO BlockManagerInfo - Added broadcast_450_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:19.580 INFO SparkContext - Created broadcast 450 from newAPIHadoopFile at PathSplitSource.java:96
20:14:19.599 INFO FileInputFormat - Total input files to process : 1
20:14:19.601 INFO MemoryStore - Block broadcast_451 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
20:14:19.601 INFO MemoryStore - Block broadcast_451_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
20:14:19.601 INFO BlockManagerInfo - Added broadcast_451_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:19.601 INFO SparkContext - Created broadcast 451 from broadcast at ReadsSparkSink.java:133
20:14:19.602 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:19.603 INFO MemoryStore - Block broadcast_452 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
20:14:19.603 INFO MemoryStore - Block broadcast_452_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
20:14:19.603 INFO BlockManagerInfo - Added broadcast_452_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:19.604 INFO SparkContext - Created broadcast 452 from broadcast at BamSink.java:76
20:14:19.605 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts dst=null perm=null proto=rpc
20:14:19.606 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:19.606 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:19.606 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:19.606 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:19.617 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:19.617 INFO DAGScheduler - Registering RDD 1078 (mapToPair at SparkUtils.java:161) as input to shuffle 45
20:14:19.617 INFO DAGScheduler - Got job 169 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:19.617 INFO DAGScheduler - Final stage: ResultStage 226 (runJob at SparkHadoopWriter.scala:83)
20:14:19.617 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 225)
20:14:19.617 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 225)
20:14:19.617 INFO DAGScheduler - Submitting ShuffleMapStage 225 (MapPartitionsRDD[1078] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:19.634 INFO MemoryStore - Block broadcast_453 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
20:14:19.636 INFO MemoryStore - Block broadcast_453_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.3 MiB)
20:14:19.636 INFO BlockManagerInfo - Added broadcast_453_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.1 MiB)
20:14:19.636 INFO SparkContext - Created broadcast 453 from broadcast at DAGScheduler.scala:1580
20:14:19.636 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 225 (MapPartitionsRDD[1078] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:19.636 INFO TaskSchedulerImpl - Adding task set 225.0 with 1 tasks resource profile 0
20:14:19.636 INFO TaskSetManager - Starting task 0.0 in stage 225.0 (TID 281) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:19.637 INFO Executor - Running task 0.0 in stage 225.0 (TID 281)
20:14:19.666 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:19.681 INFO Executor - Finished task 0.0 in stage 225.0 (TID 281). 1148 bytes result sent to driver
20:14:19.681 INFO TaskSetManager - Finished task 0.0 in stage 225.0 (TID 281) in 45 ms on localhost (executor driver) (1/1)
20:14:19.681 INFO TaskSchedulerImpl - Removed TaskSet 225.0, whose tasks have all completed, from pool
20:14:19.681 INFO DAGScheduler - ShuffleMapStage 225 (mapToPair at SparkUtils.java:161) finished in 0.063 s
20:14:19.681 INFO DAGScheduler - looking for newly runnable stages
20:14:19.681 INFO DAGScheduler - running: HashSet()
20:14:19.681 INFO DAGScheduler - waiting: HashSet(ResultStage 226)
20:14:19.681 INFO DAGScheduler - failed: HashSet()
20:14:19.682 INFO DAGScheduler - Submitting ResultStage 226 (MapPartitionsRDD[1083] at mapToPair at BamSink.java:91), which has no missing parents
20:14:19.688 INFO MemoryStore - Block broadcast_454 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
20:14:19.693 INFO BlockManagerInfo - Removed broadcast_441_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.1 MiB)
20:14:19.693 INFO MemoryStore - Block broadcast_454_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.2 MiB)
20:14:19.693 INFO BlockManagerInfo - Added broadcast_454_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.1 MiB)
20:14:19.694 INFO SparkContext - Created broadcast 454 from broadcast at DAGScheduler.scala:1580
20:14:19.694 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 226 (MapPartitionsRDD[1083] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:19.694 INFO TaskSchedulerImpl - Adding task set 226.0 with 1 tasks resource profile 0
20:14:19.694 INFO BlockManagerInfo - Removed broadcast_447_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.2 MiB)
20:14:19.695 INFO TaskSetManager - Starting task 0.0 in stage 226.0 (TID 282) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:19.695 INFO BlockManagerInfo - Removed broadcast_442_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.4 MiB)
20:14:19.695 INFO Executor - Running task 0.0 in stage 226.0 (TID 282)
20:14:19.695 INFO BlockManagerInfo - Removed broadcast_444_piece0 on localhost:35739 in memory (size: 8.3 KiB, free: 1919.4 MiB)
20:14:19.696 INFO BlockManagerInfo - Removed broadcast_448_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.4 MiB)
20:14:19.696 INFO BlockManagerInfo - Removed broadcast_440_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:19.697 INFO BlockManagerInfo - Removed broadcast_443_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.5 MiB)
20:14:19.697 INFO BlockManagerInfo - Removed broadcast_445_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:19.698 INFO BlockManagerInfo - Removed broadcast_438_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:19.698 INFO BlockManagerInfo - Removed broadcast_446_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.7 MiB)
20:14:19.698 INFO BlockManagerInfo - Removed broadcast_450_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:19.701 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:19.701 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:19.715 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:19.715 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:19.715 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:19.715 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:19.715 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:19.715 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:19.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:19.717 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:19.720 INFO StateChange - BLOCK* allocate blk_1073741885_1061, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/part-r-00000
20:14:19.721 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741885_1061 src: /127.0.0.1:35042 dest: /127.0.0.1:38353
20:14:19.723 INFO clienttrace - src: /127.0.0.1:35042, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741885_1061, duration(ns): 1288178
20:14:19.723 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741885_1061, type=LAST_IN_PIPELINE terminating
20:14:19.724 INFO FSNamesystem - BLOCK* blk_1073741885_1061 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/part-r-00000
20:14:20.125 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:20.125 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:14:20.127 INFO StateChange - BLOCK* allocate blk_1073741886_1062, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/.part-r-00000.bai
20:14:20.128 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741886_1062 src: /127.0.0.1:35044 dest: /127.0.0.1:38353
20:14:20.129 INFO clienttrace - src: /127.0.0.1:35044, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741886_1062, duration(ns): 408595
20:14:20.129 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741886_1062, type=LAST_IN_PIPELINE terminating
20:14:20.129 INFO FSNamesystem - BLOCK* blk_1073741886_1062 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/.part-r-00000.bai
20:14:20.530 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:20.531 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0 dst=null perm=null proto=rpc
20:14:20.531 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0 dst=null perm=null proto=rpc
20:14:20.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/task_202502102014198916186941333771370_1083_r_000000 dst=null perm=null proto=rpc
20:14:20.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/_temporary/attempt_202502102014198916186941333771370_1083_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/task_202502102014198916186941333771370_1083_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:20.532 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014198916186941333771370_1083_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/task_202502102014198916186941333771370_1083_r_000000
20:14:20.533 INFO SparkHadoopMapRedUtil - attempt_202502102014198916186941333771370_1083_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:20.533 INFO Executor - Finished task 0.0 in stage 226.0 (TID 282). 1858 bytes result sent to driver
20:14:20.533 INFO TaskSetManager - Finished task 0.0 in stage 226.0 (TID 282) in 839 ms on localhost (executor driver) (1/1)
20:14:20.533 INFO TaskSchedulerImpl - Removed TaskSet 226.0, whose tasks have all completed, from pool
20:14:20.534 INFO DAGScheduler - ResultStage 226 (runJob at SparkHadoopWriter.scala:83) finished in 0.851 s
20:14:20.534 INFO DAGScheduler - Job 169 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:20.534 INFO TaskSchedulerImpl - Killing all running tasks in stage 226: Stage finished
20:14:20.534 INFO DAGScheduler - Job 169 finished: runJob at SparkHadoopWriter.scala:83, took 0.917005 s
20:14:20.534 INFO SparkHadoopWriter - Start to commit write Job job_202502102014198916186941333771370_1083.
20:14:20.534 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:20.535 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts dst=null perm=null proto=rpc
20:14:20.535 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/task_202502102014198916186941333771370_1083_r_000000 dst=null perm=null proto=rpc
20:14:20.536 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:20.536 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/task_202502102014198916186941333771370_1083_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.536 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:20.537 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary/0/task_202502102014198916186941333771370_1083_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.537 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_temporary dst=null perm=null proto=rpc
20:14:20.538 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.539 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:20.539 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/.spark-staging-1083 dst=null perm=null proto=rpc
20:14:20.539 INFO SparkHadoopWriter - Write Job job_202502102014198916186941333771370_1083 committed. Elapsed time: 5 ms.
20:14:20.540 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.541 INFO StateChange - BLOCK* allocate blk_1073741887_1063, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/header
20:14:20.542 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741887_1063 src: /127.0.0.1:35048 dest: /127.0.0.1:38353
20:14:20.543 INFO clienttrace - src: /127.0.0.1:35048, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741887_1063, duration(ns): 447999
20:14:20.543 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741887_1063, type=LAST_IN_PIPELINE terminating
20:14:20.544 INFO FSNamesystem - BLOCK* blk_1073741887_1063 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/header
20:14:20.945 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:20.946 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.947 INFO StateChange - BLOCK* allocate blk_1073741888_1064, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/terminator
20:14:20.948 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741888_1064 src: /127.0.0.1:35050 dest: /127.0.0.1:38353
20:14:20.949 INFO clienttrace - src: /127.0.0.1:35050, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741888_1064, duration(ns): 399967
20:14:20.949 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741888_1064, type=LAST_IN_PIPELINE terminating
20:14:20.950 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:20.950 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts dst=null perm=null proto=rpc
20:14:20.951 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.952 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:20.952 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam
20:14:20.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.953 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:20.953 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:20.954 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.954 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam done
20:14:20.954 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:20.954 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai
20:14:20.955 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts dst=null perm=null proto=rpc
20:14:20.955 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:20.956 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:20.956 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:20.957 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:20.958 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:20.958 INFO StateChange - BLOCK* allocate blk_1073741889_1065, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai
20:14:20.959 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741889_1065 src: /127.0.0.1:35060 dest: /127.0.0.1:38353
20:14:20.960 INFO clienttrace - src: /127.0.0.1:35060, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741889_1065, duration(ns): 381565
20:14:20.960 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741889_1065, type=LAST_IN_PIPELINE terminating
20:14:20.961 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:20.961 INFO IndexFileMerger - Done merging .bai files
20:14:20.961 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.parts dst=null perm=null proto=rpc
20:14:20.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:20.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:20.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:20.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:20.972 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:20.972 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:20.973 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:20.973 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:20.974 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:20.976 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:20.976 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:20.976 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.sbi dst=null perm=null proto=rpc
20:14:20.977 INFO MemoryStore - Block broadcast_455 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:20.984 INFO MemoryStore - Block broadcast_455_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:20.984 INFO BlockManagerInfo - Added broadcast_455_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:20.984 INFO SparkContext - Created broadcast 455 from newAPIHadoopFile at PathSplitSource.java:96
20:14:21.004 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.004 INFO FileInputFormat - Total input files to process : 1
20:14:21.004 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.040 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:21.040 INFO DAGScheduler - Got job 170 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:21.040 INFO DAGScheduler - Final stage: ResultStage 227 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:21.040 INFO DAGScheduler - Parents of final stage: List()
20:14:21.040 INFO DAGScheduler - Missing parents: List()
20:14:21.040 INFO DAGScheduler - Submitting ResultStage 227 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:21.057 INFO MemoryStore - Block broadcast_456 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
20:14:21.058 INFO MemoryStore - Block broadcast_456_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
20:14:21.058 INFO BlockManagerInfo - Added broadcast_456_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:21.058 INFO SparkContext - Created broadcast 456 from broadcast at DAGScheduler.scala:1580
20:14:21.058 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 227 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:21.058 INFO TaskSchedulerImpl - Adding task set 227.0 with 1 tasks resource profile 0
20:14:21.059 INFO TaskSetManager - Starting task 0.0 in stage 227.0 (TID 283) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:21.059 INFO Executor - Running task 0.0 in stage 227.0 (TID 283)
20:14:21.088 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam:0+237038
20:14:21.088 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.089 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.090 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:21.090 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.090 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.091 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.091 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.092 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.093 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:21.095 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:21.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.097 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.103 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.103 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.104 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.104 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.105 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.106 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.107 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.107 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.108 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.108 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.109 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.110 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.110 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.111 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.111 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.113 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.113 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.114 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.114 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.116 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.116 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.117 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.118 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.118 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.119 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.120 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.120 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.121 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.121 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.122 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.123 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.123 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.124 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.124 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.126 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.129 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.130 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.130 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.131 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.132 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.132 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.134 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.134 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.135 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.136 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.136 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.138 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.138 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.139 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.140 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.140 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.142 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.143 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.143 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.145 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.145 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.147 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.147 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.148 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.148 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.148 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.149 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.150 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.150 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.152 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:21.154 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:21.154 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:21.155 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.156 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:21.158 INFO Executor - Finished task 0.0 in stage 227.0 (TID 283). 651483 bytes result sent to driver
20:14:21.159 INFO TaskSetManager - Finished task 0.0 in stage 227.0 (TID 283) in 100 ms on localhost (executor driver) (1/1)
20:14:21.159 INFO TaskSchedulerImpl - Removed TaskSet 227.0, whose tasks have all completed, from pool
20:14:21.159 INFO DAGScheduler - ResultStage 227 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.118 s
20:14:21.159 INFO DAGScheduler - Job 170 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:21.159 INFO TaskSchedulerImpl - Killing all running tasks in stage 227: Stage finished
20:14:21.160 INFO DAGScheduler - Job 170 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.119783 s
20:14:21.169 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:21.169 INFO DAGScheduler - Got job 171 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:21.169 INFO DAGScheduler - Final stage: ResultStage 228 (count at ReadsSparkSinkUnitTest.java:185)
20:14:21.169 INFO DAGScheduler - Parents of final stage: List()
20:14:21.169 INFO DAGScheduler - Missing parents: List()
20:14:21.169 INFO DAGScheduler - Submitting ResultStage 228 (MapPartitionsRDD[1071] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:21.186 INFO MemoryStore - Block broadcast_457 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
20:14:21.187 INFO MemoryStore - Block broadcast_457_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
20:14:21.187 INFO BlockManagerInfo - Added broadcast_457_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.4 MiB)
20:14:21.187 INFO SparkContext - Created broadcast 457 from broadcast at DAGScheduler.scala:1580
20:14:21.187 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 228 (MapPartitionsRDD[1071] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:21.187 INFO TaskSchedulerImpl - Adding task set 228.0 with 1 tasks resource profile 0
20:14:21.188 INFO TaskSetManager - Starting task 0.0 in stage 228.0 (TID 284) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:21.188 INFO Executor - Running task 0.0 in stage 228.0 (TID 284)
20:14:21.216 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:21.226 INFO Executor - Finished task 0.0 in stage 228.0 (TID 284). 989 bytes result sent to driver
20:14:21.226 INFO TaskSetManager - Finished task 0.0 in stage 228.0 (TID 284) in 38 ms on localhost (executor driver) (1/1)
20:14:21.226 INFO TaskSchedulerImpl - Removed TaskSet 228.0, whose tasks have all completed, from pool
20:14:21.226 INFO DAGScheduler - ResultStage 228 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:21.226 INFO DAGScheduler - Job 171 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:21.226 INFO TaskSchedulerImpl - Killing all running tasks in stage 228: Stage finished
20:14:21.226 INFO DAGScheduler - Job 171 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057570 s
20:14:21.230 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:21.230 INFO DAGScheduler - Got job 172 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:21.230 INFO DAGScheduler - Final stage: ResultStage 229 (count at ReadsSparkSinkUnitTest.java:185)
20:14:21.230 INFO DAGScheduler - Parents of final stage: List()
20:14:21.230 INFO DAGScheduler - Missing parents: List()
20:14:21.230 INFO DAGScheduler - Submitting ResultStage 229 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:21.253 INFO MemoryStore - Block broadcast_458 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
20:14:21.254 INFO MemoryStore - Block broadcast_458_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
20:14:21.254 INFO BlockManagerInfo - Added broadcast_458_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.2 MiB)
20:14:21.254 INFO SparkContext - Created broadcast 458 from broadcast at DAGScheduler.scala:1580
20:14:21.254 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 229 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:21.254 INFO TaskSchedulerImpl - Adding task set 229.0 with 1 tasks resource profile 0
20:14:21.255 INFO TaskSetManager - Starting task 0.0 in stage 229.0 (TID 285) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:21.255 INFO Executor - Running task 0.0 in stage 229.0 (TID 285)
20:14:21.284 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam:0+237038
20:14:21.284 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.285 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.286 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.287 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.287 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.288 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.288 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.289 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:21.291 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:21.291 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:21.291 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.292 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.292 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.293 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:21.296 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.297 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.298 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.298 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.299 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.300 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.301 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.302 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.302 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.303 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.304 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.305 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.305 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.307 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.309 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.309 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.310 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.311 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.312 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.313 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.315 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.315 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.316 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.316 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.317 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.317 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.319 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.319 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.320 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.321 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.321 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.322 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.322 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.323 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.324 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.324 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.326 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.326 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.327 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.327 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.329 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.329 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.330 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.331 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.331 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.333 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.333 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.334 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.335 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.336 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.336 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.338 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.338 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.339 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.340 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:21.340 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.340 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.341 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.341 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam dst=null perm=null proto=rpc
20:14:21.342 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.342 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.343 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_fec70bf2-94e6-4556-9ecc-d373ea39cfd5.bam.bai dst=null perm=null proto=rpc
20:14:21.344 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:21.347 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:21.348 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:21.348 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:21.349 INFO Executor - Finished task 0.0 in stage 229.0 (TID 285). 989 bytes result sent to driver
20:14:21.350 INFO TaskSetManager - Finished task 0.0 in stage 229.0 (TID 285) in 95 ms on localhost (executor driver) (1/1)
20:14:21.350 INFO TaskSchedulerImpl - Removed TaskSet 229.0, whose tasks have all completed, from pool
20:14:21.350 INFO DAGScheduler - ResultStage 229 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.120 s
20:14:21.350 INFO DAGScheduler - Job 172 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:21.350 INFO TaskSchedulerImpl - Killing all running tasks in stage 229: Stage finished
20:14:21.350 INFO DAGScheduler - Job 172 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.120668 s
20:14:21.364 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:21.365 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:21.365 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:21.366 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:21.369 INFO MemoryStore - Block broadcast_459 stored as values in memory (estimated size 297.9 KiB, free 1916.0 MiB)
20:14:21.380 INFO MemoryStore - Block broadcast_459_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
20:14:21.380 INFO BlockManagerInfo - Added broadcast_459_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:21.380 INFO SparkContext - Created broadcast 459 from newAPIHadoopFile at PathSplitSource.java:96
20:14:21.401 INFO MemoryStore - Block broadcast_460 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
20:14:21.407 INFO MemoryStore - Block broadcast_460_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.6 MiB)
20:14:21.407 INFO BlockManagerInfo - Added broadcast_460_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.1 MiB)
20:14:21.408 INFO SparkContext - Created broadcast 460 from newAPIHadoopFile at PathSplitSource.java:96
20:14:21.427 INFO FileInputFormat - Total input files to process : 1
20:14:21.429 INFO MemoryStore - Block broadcast_461 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
20:14:21.429 INFO MemoryStore - Block broadcast_461_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
20:14:21.430 INFO BlockManagerInfo - Added broadcast_461_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.1 MiB)
20:14:21.430 INFO SparkContext - Created broadcast 461 from broadcast at ReadsSparkSink.java:133
20:14:21.430 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:21.431 INFO MemoryStore - Block broadcast_462 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
20:14:21.432 INFO MemoryStore - Block broadcast_462_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
20:14:21.432 INFO BlockManagerInfo - Added broadcast_462_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.1 MiB)
20:14:21.432 INFO SparkContext - Created broadcast 462 from broadcast at BamSink.java:76
20:14:21.434 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts dst=null perm=null proto=rpc
20:14:21.434 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:21.434 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:21.434 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:21.435 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:21.441 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:21.441 INFO DAGScheduler - Registering RDD 1104 (mapToPair at SparkUtils.java:161) as input to shuffle 46
20:14:21.441 INFO DAGScheduler - Got job 173 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:21.441 INFO DAGScheduler - Final stage: ResultStage 231 (runJob at SparkHadoopWriter.scala:83)
20:14:21.441 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 230)
20:14:21.441 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 230)
20:14:21.441 INFO DAGScheduler - Submitting ShuffleMapStage 230 (MapPartitionsRDD[1104] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:21.458 INFO MemoryStore - Block broadcast_463 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
20:14:21.459 INFO MemoryStore - Block broadcast_463_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
20:14:21.459 INFO BlockManagerInfo - Added broadcast_463_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1918.9 MiB)
20:14:21.460 INFO SparkContext - Created broadcast 463 from broadcast at DAGScheduler.scala:1580
20:14:21.460 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 230 (MapPartitionsRDD[1104] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:21.460 INFO TaskSchedulerImpl - Adding task set 230.0 with 1 tasks resource profile 0
20:14:21.460 INFO TaskSetManager - Starting task 0.0 in stage 230.0 (TID 286) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:21.460 INFO Executor - Running task 0.0 in stage 230.0 (TID 286)
20:14:21.489 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:21.504 INFO Executor - Finished task 0.0 in stage 230.0 (TID 286). 1148 bytes result sent to driver
20:14:21.508 INFO TaskSetManager - Finished task 0.0 in stage 230.0 (TID 286) in 48 ms on localhost (executor driver) (1/1)
20:14:21.508 INFO TaskSchedulerImpl - Removed TaskSet 230.0, whose tasks have all completed, from pool
20:14:21.508 INFO DAGScheduler - ShuffleMapStage 230 (mapToPair at SparkUtils.java:161) finished in 0.066 s
20:14:21.508 INFO DAGScheduler - looking for newly runnable stages
20:14:21.508 INFO DAGScheduler - running: HashSet()
20:14:21.508 INFO DAGScheduler - waiting: HashSet(ResultStage 231)
20:14:21.508 INFO DAGScheduler - failed: HashSet()
20:14:21.508 INFO BlockManagerInfo - Removed broadcast_451_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1918.9 MiB)
20:14:21.508 INFO DAGScheduler - Submitting ResultStage 231 (MapPartitionsRDD[1109] at mapToPair at BamSink.java:91), which has no missing parents
20:14:21.509 INFO BlockManagerInfo - Removed broadcast_449_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.0 MiB)
20:14:21.509 INFO BlockManagerInfo - Removed broadcast_457_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.1 MiB)
20:14:21.510 INFO BlockManagerInfo - Removed broadcast_453_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.3 MiB)
20:14:21.510 INFO BlockManagerInfo - Removed broadcast_460_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.3 MiB)
20:14:21.511 INFO BlockManagerInfo - Removed broadcast_452_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:21.511 INFO BlockManagerInfo - Removed broadcast_454_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.4 MiB)
20:14:21.511 INFO BlockManagerInfo - Removed broadcast_455_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:21.512 INFO BlockManagerInfo - Removed broadcast_458_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.6 MiB)
20:14:21.512 INFO BlockManagerInfo - Removed broadcast_456_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.8 MiB)
20:14:21.516 INFO MemoryStore - Block broadcast_464 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
20:14:21.517 INFO MemoryStore - Block broadcast_464_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
20:14:21.517 INFO BlockManagerInfo - Added broadcast_464_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.7 MiB)
20:14:21.517 INFO SparkContext - Created broadcast 464 from broadcast at DAGScheduler.scala:1580
20:14:21.518 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 231 (MapPartitionsRDD[1109] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:21.518 INFO TaskSchedulerImpl - Adding task set 231.0 with 1 tasks resource profile 0
20:14:21.518 INFO TaskSetManager - Starting task 0.0 in stage 231.0 (TID 287) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:21.518 INFO Executor - Running task 0.0 in stage 231.0 (TID 287)
20:14:21.525 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:21.525 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:21.536 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:21.536 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:21.536 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:21.536 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:21.536 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:21.536 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:21.537 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:21.538 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:21.540 INFO StateChange - BLOCK* allocate blk_1073741890_1066, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/part-r-00000
20:14:21.541 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741890_1066 src: /127.0.0.1:35782 dest: /127.0.0.1:38353
20:14:21.542 INFO clienttrace - src: /127.0.0.1:35782, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741890_1066, duration(ns): 1154961
20:14:21.543 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741890_1066, type=LAST_IN_PIPELINE terminating
20:14:21.543 INFO FSNamesystem - BLOCK* blk_1073741890_1066 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/part-r-00000
20:14:21.944 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:21.944 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:14:21.945 INFO StateChange - BLOCK* allocate blk_1073741891_1067, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/.part-r-00000.sbi
20:14:21.946 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741891_1067 src: /127.0.0.1:35798 dest: /127.0.0.1:38353
20:14:21.947 INFO clienttrace - src: /127.0.0.1:35798, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741891_1067, duration(ns): 406683
20:14:21.947 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741891_1067, type=LAST_IN_PIPELINE terminating
20:14:21.948 INFO FSNamesystem - BLOCK* blk_1073741891_1067 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/.part-r-00000.sbi
20:14:22.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741879_1055 replica FinalizedReplica, blk_1073741879_1055, FINALIZED
  getNumBytes()     = 13492
  getBytesOnDisk()  = 13492
  getVisibleLength()= 13492
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741879 for deletion
20:14:22.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741880_1056 replica FinalizedReplica, blk_1073741880_1056, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741880 for deletion
20:14:22.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741886_1062 replica FinalizedReplica, blk_1073741886_1062, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741886 for deletion
20:14:22.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741879_1055 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741879
20:14:22.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741880_1056 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741880
20:14:22.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741886_1062 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741886
20:14:22.349 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:22.350 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0 dst=null perm=null proto=rpc
20:14:22.350 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0 dst=null perm=null proto=rpc
20:14:22.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/task_202502102014213151126610768310689_1109_r_000000 dst=null perm=null proto=rpc
20:14:22.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/_temporary/attempt_202502102014213151126610768310689_1109_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/task_202502102014213151126610768310689_1109_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:22.351 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014213151126610768310689_1109_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/task_202502102014213151126610768310689_1109_r_000000
20:14:22.351 INFO SparkHadoopMapRedUtil - attempt_202502102014213151126610768310689_1109_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:22.352 INFO Executor - Finished task 0.0 in stage 231.0 (TID 287). 1858 bytes result sent to driver
20:14:22.352 INFO TaskSetManager - Finished task 0.0 in stage 231.0 (TID 287) in 834 ms on localhost (executor driver) (1/1)
20:14:22.352 INFO TaskSchedulerImpl - Removed TaskSet 231.0, whose tasks have all completed, from pool
20:14:22.352 INFO DAGScheduler - ResultStage 231 (runJob at SparkHadoopWriter.scala:83) finished in 0.843 s
20:14:22.352 INFO DAGScheduler - Job 173 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:22.352 INFO TaskSchedulerImpl - Killing all running tasks in stage 231: Stage finished
20:14:22.352 INFO DAGScheduler - Job 173 finished: runJob at SparkHadoopWriter.scala:83, took 0.911647 s
20:14:22.353 INFO SparkHadoopWriter - Start to commit write Job job_202502102014213151126610768310689_1109.
20:14:22.353 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:22.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts dst=null perm=null proto=rpc
20:14:22.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/task_202502102014213151126610768310689_1109_r_000000 dst=null perm=null proto=rpc
20:14:22.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:22.355 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/task_202502102014213151126610768310689_1109_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:22.355 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:22.356 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary/0/task_202502102014213151126610768310689_1109_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:22.356 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_temporary dst=null perm=null proto=rpc
20:14:22.357 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:22.357 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:22.358 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/.spark-staging-1109 dst=null perm=null proto=rpc
20:14:22.358 INFO SparkHadoopWriter - Write Job job_202502102014213151126610768310689_1109 committed. Elapsed time: 5 ms.
20:14:22.359 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:22.360 INFO StateChange - BLOCK* allocate blk_1073741892_1068, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/header
20:14:22.361 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741892_1068 src: /127.0.0.1:35804 dest: /127.0.0.1:38353
20:14:22.362 INFO clienttrace - src: /127.0.0.1:35804, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741892_1068, duration(ns): 447829
20:14:22.362 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741892_1068, type=LAST_IN_PIPELINE terminating
20:14:22.362 INFO FSNamesystem - BLOCK* blk_1073741892_1068 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/header
20:14:22.763 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:22.764 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:22.765 INFO StateChange - BLOCK* allocate blk_1073741893_1069, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/terminator
20:14:22.766 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741893_1069 src: /127.0.0.1:35810 dest: /127.0.0.1:38353
20:14:22.767 INFO clienttrace - src: /127.0.0.1:35810, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741893_1069, duration(ns): 405229
20:14:22.767 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741893_1069, type=LAST_IN_PIPELINE terminating
20:14:22.767 INFO FSNamesystem - BLOCK* blk_1073741893_1069 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/terminator
20:14:23.168 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:23.169 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts dst=null perm=null proto=rpc
20:14:23.170 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:23.170 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:23.170 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam
20:14:23.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:23.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:23.172 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam done
20:14:23.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.173 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi
20:14:23.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts dst=null perm=null proto=rpc
20:14:23.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:23.174 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:23.175 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:23.176 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:23.176 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:23.177 INFO StateChange - BLOCK* allocate blk_1073741894_1070, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi
20:14:23.177 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741894_1070 src: /127.0.0.1:35816 dest: /127.0.0.1:38353
20:14:23.178 INFO clienttrace - src: /127.0.0.1:35816, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741894_1070, duration(ns): 395591
20:14:23.178 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741894_1070, type=LAST_IN_PIPELINE terminating
20:14:23.179 INFO FSNamesystem - BLOCK* blk_1073741894_1070 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi
20:14:23.579 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:23.580 INFO IndexFileMerger - Done merging .sbi files
20:14:23.580 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.parts dst=null perm=null proto=rpc
20:14:23.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi dst=null perm=null proto=rpc
20:14:23.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi dst=null perm=null proto=rpc
20:14:23.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi dst=null perm=null proto=rpc
20:14:23.591 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:23.591 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.591 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.592 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.592 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.593 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.bai dst=null perm=null proto=rpc
20:14:23.593 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bai dst=null perm=null proto=rpc
20:14:23.594 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:23.595 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:23.595 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi dst=null perm=null proto=rpc
20:14:23.596 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi dst=null perm=null proto=rpc
20:14:23.596 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.sbi dst=null perm=null proto=rpc
20:14:23.597 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:23.597 INFO MemoryStore - Block broadcast_465 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
20:14:23.597 INFO MemoryStore - Block broadcast_465_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
20:14:23.597 INFO BlockManagerInfo - Added broadcast_465_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.7 MiB)
20:14:23.598 INFO SparkContext - Created broadcast 465 from broadcast at BamSource.java:104
20:14:23.598 INFO MemoryStore - Block broadcast_466 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:23.605 INFO MemoryStore - Block broadcast_466_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:23.605 INFO BlockManagerInfo - Added broadcast_466_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:23.605 INFO SparkContext - Created broadcast 466 from newAPIHadoopFile at PathSplitSource.java:96
20:14:23.613 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.614 INFO FileInputFormat - Total input files to process : 1
20:14:23.614 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.628 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:23.629 INFO DAGScheduler - Got job 174 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:23.629 INFO DAGScheduler - Final stage: ResultStage 232 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:23.629 INFO DAGScheduler - Parents of final stage: List()
20:14:23.629 INFO DAGScheduler - Missing parents: List()
20:14:23.629 INFO DAGScheduler - Submitting ResultStage 232 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:23.635 INFO MemoryStore - Block broadcast_467 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
20:14:23.635 INFO MemoryStore - Block broadcast_467_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
20:14:23.635 INFO BlockManagerInfo - Added broadcast_467_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:14:23.635 INFO SparkContext - Created broadcast 467 from broadcast at DAGScheduler.scala:1580
20:14:23.636 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 232 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:23.636 INFO TaskSchedulerImpl - Adding task set 232.0 with 1 tasks resource profile 0
20:14:23.636 INFO TaskSetManager - Starting task 0.0 in stage 232.0 (TID 288) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:23.636 INFO Executor - Running task 0.0 in stage 232.0 (TID 288)
20:14:23.648 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam:0+237038
20:14:23.648 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.649 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.649 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.bai dst=null perm=null proto=rpc
20:14:23.649 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bai dst=null perm=null proto=rpc
20:14:23.650 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:23.652 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:23.653 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:23.654 INFO Executor - Finished task 0.0 in stage 232.0 (TID 288). 651483 bytes result sent to driver
20:14:23.656 INFO TaskSetManager - Finished task 0.0 in stage 232.0 (TID 288) in 20 ms on localhost (executor driver) (1/1)
20:14:23.656 INFO TaskSchedulerImpl - Removed TaskSet 232.0, whose tasks have all completed, from pool
20:14:23.656 INFO DAGScheduler - ResultStage 232 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.027 s
20:14:23.656 INFO DAGScheduler - Job 174 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:23.656 INFO TaskSchedulerImpl - Killing all running tasks in stage 232: Stage finished
20:14:23.656 INFO DAGScheduler - Job 174 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.027681 s
20:14:23.666 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:23.666 INFO DAGScheduler - Got job 175 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:23.666 INFO DAGScheduler - Final stage: ResultStage 233 (count at ReadsSparkSinkUnitTest.java:185)
20:14:23.666 INFO DAGScheduler - Parents of final stage: List()
20:14:23.666 INFO DAGScheduler - Missing parents: List()
20:14:23.666 INFO DAGScheduler - Submitting ResultStage 233 (MapPartitionsRDD[1097] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:23.682 INFO MemoryStore - Block broadcast_468 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
20:14:23.684 INFO MemoryStore - Block broadcast_468_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
20:14:23.684 INFO BlockManagerInfo - Added broadcast_468_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:23.684 INFO SparkContext - Created broadcast 468 from broadcast at DAGScheduler.scala:1580
20:14:23.684 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 233 (MapPartitionsRDD[1097] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:23.684 INFO TaskSchedulerImpl - Adding task set 233.0 with 1 tasks resource profile 0
20:14:23.684 INFO TaskSetManager - Starting task 0.0 in stage 233.0 (TID 289) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:23.685 INFO Executor - Running task 0.0 in stage 233.0 (TID 289)
20:14:23.713 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:23.722 INFO Executor - Finished task 0.0 in stage 233.0 (TID 289). 989 bytes result sent to driver
20:14:23.723 INFO TaskSetManager - Finished task 0.0 in stage 233.0 (TID 289) in 39 ms on localhost (executor driver) (1/1)
20:14:23.723 INFO TaskSchedulerImpl - Removed TaskSet 233.0, whose tasks have all completed, from pool
20:14:23.723 INFO DAGScheduler - ResultStage 233 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.057 s
20:14:23.723 INFO DAGScheduler - Job 175 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:23.723 INFO TaskSchedulerImpl - Killing all running tasks in stage 233: Stage finished
20:14:23.723 INFO DAGScheduler - Job 175 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.057255 s
20:14:23.726 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:23.727 INFO DAGScheduler - Got job 176 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:23.727 INFO DAGScheduler - Final stage: ResultStage 234 (count at ReadsSparkSinkUnitTest.java:185)
20:14:23.727 INFO DAGScheduler - Parents of final stage: List()
20:14:23.727 INFO DAGScheduler - Missing parents: List()
20:14:23.727 INFO DAGScheduler - Submitting ResultStage 234 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:23.736 INFO MemoryStore - Block broadcast_469 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
20:14:23.736 INFO MemoryStore - Block broadcast_469_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
20:14:23.736 INFO BlockManagerInfo - Added broadcast_469_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.4 MiB)
20:14:23.737 INFO SparkContext - Created broadcast 469 from broadcast at DAGScheduler.scala:1580
20:14:23.737 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 234 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:23.737 INFO TaskSchedulerImpl - Adding task set 234.0 with 1 tasks resource profile 0
20:14:23.737 INFO TaskSetManager - Starting task 0.0 in stage 234.0 (TID 290) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:23.737 INFO Executor - Running task 0.0 in stage 234.0 (TID 290)
20:14:23.748 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam:0+237038
20:14:23.749 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.749 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam dst=null perm=null proto=rpc
20:14:23.750 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bam.bai dst=null perm=null proto=rpc
20:14:23.750 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_01e158a0-7a71-40f0-8383-486f2de2a0d6.bai dst=null perm=null proto=rpc
20:14:23.751 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:23.753 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:23.753 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:23.755 INFO Executor - Finished task 0.0 in stage 234.0 (TID 290). 989 bytes result sent to driver
20:14:23.755 INFO TaskSetManager - Finished task 0.0 in stage 234.0 (TID 290) in 18 ms on localhost (executor driver) (1/1)
20:14:23.755 INFO TaskSchedulerImpl - Removed TaskSet 234.0, whose tasks have all completed, from pool
20:14:23.755 INFO DAGScheduler - ResultStage 234 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.028 s
20:14:23.755 INFO DAGScheduler - Job 176 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:23.755 INFO TaskSchedulerImpl - Killing all running tasks in stage 234: Stage finished
20:14:23.756 INFO DAGScheduler - Job 176 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.029146 s
20:14:23.766 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:23.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:23.767 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:23.768 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:23.770 INFO MemoryStore - Block broadcast_470 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
20:14:23.776 INFO MemoryStore - Block broadcast_470_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
20:14:23.776 INFO BlockManagerInfo - Added broadcast_470_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:23.776 INFO SparkContext - Created broadcast 470 from newAPIHadoopFile at PathSplitSource.java:96
20:14:23.797 INFO MemoryStore - Block broadcast_471 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
20:14:23.803 INFO MemoryStore - Block broadcast_471_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
20:14:23.803 INFO BlockManagerInfo - Added broadcast_471_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:23.804 INFO SparkContext - Created broadcast 471 from newAPIHadoopFile at PathSplitSource.java:96
20:14:23.823 INFO FileInputFormat - Total input files to process : 1
20:14:23.825 INFO MemoryStore - Block broadcast_472 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
20:14:23.825 INFO MemoryStore - Block broadcast_472_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
20:14:23.825 INFO BlockManagerInfo - Added broadcast_472_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:23.825 INFO SparkContext - Created broadcast 472 from broadcast at ReadsSparkSink.java:133
20:14:23.826 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:23.826 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
20:14:23.827 INFO MemoryStore - Block broadcast_473 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
20:14:23.827 INFO MemoryStore - Block broadcast_473_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
20:14:23.827 INFO BlockManagerInfo - Added broadcast_473_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:23.828 INFO SparkContext - Created broadcast 473 from broadcast at BamSink.java:76
20:14:23.829 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts dst=null perm=null proto=rpc
20:14:23.830 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:23.830 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:23.830 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:23.830 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:23.837 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:23.837 INFO DAGScheduler - Registering RDD 1129 (mapToPair at SparkUtils.java:161) as input to shuffle 47
20:14:23.837 INFO DAGScheduler - Got job 177 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:23.837 INFO DAGScheduler - Final stage: ResultStage 236 (runJob at SparkHadoopWriter.scala:83)
20:14:23.837 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 235)
20:14:23.838 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 235)
20:14:23.838 INFO DAGScheduler - Submitting ShuffleMapStage 235 (MapPartitionsRDD[1129] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:23.854 INFO MemoryStore - Block broadcast_474 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
20:14:23.856 INFO MemoryStore - Block broadcast_474_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
20:14:23.856 INFO BlockManagerInfo - Added broadcast_474_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.1 MiB)
20:14:23.856 INFO SparkContext - Created broadcast 474 from broadcast at DAGScheduler.scala:1580
20:14:23.856 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 235 (MapPartitionsRDD[1129] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:23.856 INFO TaskSchedulerImpl - Adding task set 235.0 with 1 tasks resource profile 0
20:14:23.857 INFO TaskSetManager - Starting task 0.0 in stage 235.0 (TID 291) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:23.857 INFO Executor - Running task 0.0 in stage 235.0 (TID 291)
20:14:23.886 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:23.905 INFO Executor - Finished task 0.0 in stage 235.0 (TID 291). 1148 bytes result sent to driver
20:14:23.905 INFO TaskSetManager - Finished task 0.0 in stage 235.0 (TID 291) in 49 ms on localhost (executor driver) (1/1)
20:14:23.905 INFO TaskSchedulerImpl - Removed TaskSet 235.0, whose tasks have all completed, from pool
20:14:23.905 INFO DAGScheduler - ShuffleMapStage 235 (mapToPair at SparkUtils.java:161) finished in 0.067 s
20:14:23.905 INFO DAGScheduler - looking for newly runnable stages
20:14:23.905 INFO DAGScheduler - running: HashSet()
20:14:23.905 INFO DAGScheduler - waiting: HashSet(ResultStage 236)
20:14:23.905 INFO DAGScheduler - failed: HashSet()
20:14:23.906 INFO DAGScheduler - Submitting ResultStage 236 (MapPartitionsRDD[1134] at mapToPair at BamSink.java:91), which has no missing parents
20:14:23.913 INFO MemoryStore - Block broadcast_475 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
20:14:23.914 INFO MemoryStore - Block broadcast_475_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.1 MiB)
20:14:23.914 INFO BlockManagerInfo - Added broadcast_475_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.1 MiB)
20:14:23.914 INFO SparkContext - Created broadcast 475 from broadcast at DAGScheduler.scala:1580
20:14:23.914 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 236 (MapPartitionsRDD[1134] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:23.914 INFO TaskSchedulerImpl - Adding task set 236.0 with 1 tasks resource profile 0
20:14:23.915 INFO TaskSetManager - Starting task 0.0 in stage 236.0 (TID 292) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:23.915 INFO Executor - Running task 0.0 in stage 236.0 (TID 292)
20:14:23.919 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:23.919 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:23.930 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:23.930 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:23.930 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:23.930 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:23.930 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:23.930 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:23.931 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/_temporary/attempt_202502102014238237639674418202283_1134_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:23.933 INFO StateChange - BLOCK* allocate blk_1073741895_1071, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/_temporary/attempt_202502102014238237639674418202283_1134_r_000000_0/part-r-00000
20:14:23.934 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741895_1071 src: /127.0.0.1:35832 dest: /127.0.0.1:38353
20:14:23.936 INFO clienttrace - src: /127.0.0.1:35832, dest: /127.0.0.1:38353, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741895_1071, duration(ns): 897741
20:14:23.936 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741895_1071, type=LAST_IN_PIPELINE terminating
20:14:23.936 INFO FSNamesystem - BLOCK* blk_1073741895_1071 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/_temporary/attempt_202502102014238237639674418202283_1134_r_000000_0/part-r-00000
20:14:24.337 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/_temporary/attempt_202502102014238237639674418202283_1134_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:24.338 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/_temporary/attempt_202502102014238237639674418202283_1134_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:14:24.338 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/_temporary/attempt_202502102014238237639674418202283_1134_r_000000_0 dst=null perm=null proto=rpc
20:14:24.339 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/_temporary/attempt_202502102014238237639674418202283_1134_r_000000_0 dst=null perm=null proto=rpc
20:14:24.339 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/task_202502102014238237639674418202283_1134_r_000000 dst=null perm=null proto=rpc
20:14:24.340 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/_temporary/attempt_202502102014238237639674418202283_1134_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/task_202502102014238237639674418202283_1134_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:24.340 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014238237639674418202283_1134_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/task_202502102014238237639674418202283_1134_r_000000
20:14:24.340 INFO SparkHadoopMapRedUtil - attempt_202502102014238237639674418202283_1134_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:24.340 INFO Executor - Finished task 0.0 in stage 236.0 (TID 292). 1858 bytes result sent to driver
20:14:24.341 INFO TaskSetManager - Finished task 0.0 in stage 236.0 (TID 292) in 427 ms on localhost (executor driver) (1/1)
20:14:24.341 INFO TaskSchedulerImpl - Removed TaskSet 236.0, whose tasks have all completed, from pool
20:14:24.341 INFO DAGScheduler - ResultStage 236 (runJob at SparkHadoopWriter.scala:83) finished in 0.435 s
20:14:24.341 INFO DAGScheduler - Job 177 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:24.341 INFO TaskSchedulerImpl - Killing all running tasks in stage 236: Stage finished
20:14:24.341 INFO DAGScheduler - Job 177 finished: runJob at SparkHadoopWriter.scala:83, took 0.504141 s
20:14:24.341 INFO SparkHadoopWriter - Start to commit write Job job_202502102014238237639674418202283_1134.
20:14:24.342 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:24.342 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts dst=null perm=null proto=rpc
20:14:24.342 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/task_202502102014238237639674418202283_1134_r_000000 dst=null perm=null proto=rpc
20:14:24.343 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:24.343 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary/0/task_202502102014238237639674418202283_1134_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.344 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_temporary dst=null perm=null proto=rpc
20:14:24.344 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.345 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:24.345 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/.spark-staging-1134 dst=null perm=null proto=rpc
20:14:24.345 INFO SparkHadoopWriter - Write Job job_202502102014238237639674418202283_1134 committed. Elapsed time: 4 ms.
20:14:24.346 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.347 INFO StateChange - BLOCK* allocate blk_1073741896_1072, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/header
20:14:24.348 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741896_1072 src: /127.0.0.1:35838 dest: /127.0.0.1:38353
20:14:24.349 INFO clienttrace - src: /127.0.0.1:35838, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741896_1072, duration(ns): 418143
20:14:24.349 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741896_1072, type=LAST_IN_PIPELINE terminating
20:14:24.350 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:24.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.352 INFO StateChange - BLOCK* allocate blk_1073741897_1073, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/terminator
20:14:24.352 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741897_1073 src: /127.0.0.1:35844 dest: /127.0.0.1:38353
20:14:24.353 INFO clienttrace - src: /127.0.0.1:35844, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741897_1073, duration(ns): 403465
20:14:24.353 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741897_1073, type=LAST_IN_PIPELINE terminating
20:14:24.354 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:24.355 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts dst=null perm=null proto=rpc
20:14:24.356 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.356 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:24.357 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam
20:14:24.357 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.357 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.358 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.358 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.358 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam done
20:14:24.359 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.359 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.parts dst=null perm=null proto=rpc
20:14:24.360 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.360 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.361 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.361 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.362 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.bai dst=null perm=null proto=rpc
20:14:24.362 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bai dst=null perm=null proto=rpc
20:14:24.363 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:24.364 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.364 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.sbi dst=null perm=null proto=rpc
20:14:24.365 INFO MemoryStore - Block broadcast_476 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
20:14:24.371 INFO BlockManagerInfo - Removed broadcast_474_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.2 MiB)
20:14:24.372 INFO BlockManagerInfo - Removed broadcast_468_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.4 MiB)
20:14:24.373 INFO BlockManagerInfo - Removed broadcast_471_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.4 MiB)
20:14:24.374 INFO BlockManagerInfo - Removed broadcast_472_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:24.374 INFO BlockManagerInfo - Removed broadcast_463_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.6 MiB)
20:14:24.375 INFO BlockManagerInfo - Removed broadcast_459_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:24.375 INFO BlockManagerInfo - Removed broadcast_461_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.6 MiB)
20:14:24.376 INFO BlockManagerInfo - Removed broadcast_467_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.7 MiB)
20:14:24.376 INFO BlockManagerInfo - Removed broadcast_473_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:14:24.377 INFO BlockManagerInfo - Removed broadcast_475_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.8 MiB)
20:14:24.378 INFO BlockManagerInfo - Removed broadcast_462_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.8 MiB)
20:14:24.378 INFO BlockManagerInfo - Removed broadcast_469_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.8 MiB)
20:14:24.379 INFO MemoryStore - Block broadcast_476_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
20:14:24.379 INFO BlockManagerInfo - Added broadcast_476_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.8 MiB)
20:14:24.379 INFO BlockManagerInfo - Removed broadcast_466_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:24.379 INFO SparkContext - Created broadcast 476 from newAPIHadoopFile at PathSplitSource.java:96
20:14:24.379 INFO BlockManagerInfo - Removed broadcast_464_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.9 MiB)
20:14:24.380 INFO BlockManagerInfo - Removed broadcast_465_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.9 MiB)
20:14:24.410 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.410 INFO FileInputFormat - Total input files to process : 1
20:14:24.410 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.446 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:24.446 INFO DAGScheduler - Got job 178 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:24.446 INFO DAGScheduler - Final stage: ResultStage 237 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:24.446 INFO DAGScheduler - Parents of final stage: List()
20:14:24.446 INFO DAGScheduler - Missing parents: List()
20:14:24.447 INFO DAGScheduler - Submitting ResultStage 237 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:24.470 INFO MemoryStore - Block broadcast_477 stored as values in memory (estimated size 426.2 KiB, free 1918.9 MiB)
20:14:24.471 INFO MemoryStore - Block broadcast_477_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1918.8 MiB)
20:14:24.471 INFO BlockManagerInfo - Added broadcast_477_piece0 in memory on localhost:35739 (size: 153.7 KiB, free: 1919.8 MiB)
20:14:24.471 INFO SparkContext - Created broadcast 477 from broadcast at DAGScheduler.scala:1580
20:14:24.471 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 237 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:24.471 INFO TaskSchedulerImpl - Adding task set 237.0 with 1 tasks resource profile 0
20:14:24.472 INFO TaskSetManager - Starting task 0.0 in stage 237.0 (TID 293) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:24.472 INFO Executor - Running task 0.0 in stage 237.0 (TID 293)
20:14:24.501 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam:0+237038
20:14:24.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.503 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:24.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.504 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.504 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.bai dst=null perm=null proto=rpc
20:14:24.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bai dst=null perm=null proto=rpc
20:14:24.506 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:24.507 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.508 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.508 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.515 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.515 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.516 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.516 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.517 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.518 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.519 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.519 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.520 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.521 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.521 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.522 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.523 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.524 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.524 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.525 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.525 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.526 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.527 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.527 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.528 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.529 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.530 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.530 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.531 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.532 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.533 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.534 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.534 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.535 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.535 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.536 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.536 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.538 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.539 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.539 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.541 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.541 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.542 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.542 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.543 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.544 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.544 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.545 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.546 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.547 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.547 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.549 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.549 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.550 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.550 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.552 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.552 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.553 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.553 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.555 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.555 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.556 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.556 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.557 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.557 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.557 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.558 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.559 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.bai dst=null perm=null proto=rpc
20:14:24.559 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bai dst=null perm=null proto=rpc
20:14:24.560 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:24.563 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:24.566 INFO Executor - Finished task 0.0 in stage 237.0 (TID 293). 651526 bytes result sent to driver
20:14:24.568 INFO TaskSetManager - Finished task 0.0 in stage 237.0 (TID 293) in 96 ms on localhost (executor driver) (1/1)
20:14:24.568 INFO TaskSchedulerImpl - Removed TaskSet 237.0, whose tasks have all completed, from pool
20:14:24.568 INFO DAGScheduler - ResultStage 237 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.121 s
20:14:24.568 INFO DAGScheduler - Job 178 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:24.568 INFO TaskSchedulerImpl - Killing all running tasks in stage 237: Stage finished
20:14:24.568 INFO DAGScheduler - Job 178 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.121971 s
20:14:24.583 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:24.583 INFO DAGScheduler - Got job 179 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:24.583 INFO DAGScheduler - Final stage: ResultStage 238 (count at ReadsSparkSinkUnitTest.java:185)
20:14:24.583 INFO DAGScheduler - Parents of final stage: List()
20:14:24.583 INFO DAGScheduler - Missing parents: List()
20:14:24.583 INFO DAGScheduler - Submitting ResultStage 238 (MapPartitionsRDD[1122] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:24.607 INFO MemoryStore - Block broadcast_478 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
20:14:24.608 INFO MemoryStore - Block broadcast_478_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
20:14:24.608 INFO BlockManagerInfo - Added broadcast_478_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.6 MiB)
20:14:24.608 INFO SparkContext - Created broadcast 478 from broadcast at DAGScheduler.scala:1580
20:14:24.608 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 238 (MapPartitionsRDD[1122] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:24.608 INFO TaskSchedulerImpl - Adding task set 238.0 with 1 tasks resource profile 0
20:14:24.609 INFO TaskSetManager - Starting task 0.0 in stage 238.0 (TID 294) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:24.609 INFO Executor - Running task 0.0 in stage 238.0 (TID 294)
20:14:24.637 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:24.646 INFO Executor - Finished task 0.0 in stage 238.0 (TID 294). 989 bytes result sent to driver
20:14:24.646 INFO TaskSetManager - Finished task 0.0 in stage 238.0 (TID 294) in 37 ms on localhost (executor driver) (1/1)
20:14:24.647 INFO TaskSchedulerImpl - Removed TaskSet 238.0, whose tasks have all completed, from pool
20:14:24.647 INFO DAGScheduler - ResultStage 238 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
20:14:24.647 INFO DAGScheduler - Job 179 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:24.647 INFO TaskSchedulerImpl - Killing all running tasks in stage 238: Stage finished
20:14:24.647 INFO DAGScheduler - Job 179 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.063789 s
20:14:24.650 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:24.650 INFO DAGScheduler - Got job 180 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:24.650 INFO DAGScheduler - Final stage: ResultStage 239 (count at ReadsSparkSinkUnitTest.java:185)
20:14:24.650 INFO DAGScheduler - Parents of final stage: List()
20:14:24.650 INFO DAGScheduler - Missing parents: List()
20:14:24.650 INFO DAGScheduler - Submitting ResultStage 239 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:24.667 INFO MemoryStore - Block broadcast_479 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
20:14:24.668 INFO MemoryStore - Block broadcast_479_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
20:14:24.668 INFO BlockManagerInfo - Added broadcast_479_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.5 MiB)
20:14:24.668 INFO SparkContext - Created broadcast 479 from broadcast at DAGScheduler.scala:1580
20:14:24.668 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 239 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:24.668 INFO TaskSchedulerImpl - Adding task set 239.0 with 1 tasks resource profile 0
20:14:24.669 INFO TaskSetManager - Starting task 0.0 in stage 239.0 (TID 295) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:24.669 INFO Executor - Running task 0.0 in stage 239.0 (TID 295)
20:14:24.697 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam:0+237038
20:14:24.697 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.698 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.699 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:24.699 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.699 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.700 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.bai dst=null perm=null proto=rpc
20:14:24.700 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bai dst=null perm=null proto=rpc
20:14:24.701 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:24.702 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.703 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.703 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.704 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:24.708 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.708 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.709 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.710 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.711 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.711 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.712 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.712 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.714 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.714 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.715 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.715 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.716 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.716 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.717 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.717 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.718 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.719 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.720 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.720 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.721 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.721 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.723 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.723 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.724 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.724 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.725 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.726 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.726 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.727 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.727 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.728 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.729 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.730 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.730 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.731 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.732 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.732 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.734 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.735 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.735 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.736 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.737 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.737 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.738 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.739 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.740 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.740 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.741 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.742 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.742 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.743 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.744 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.745 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.745 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.746 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.747 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.747 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.748 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.749 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.749 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.751 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
20:14:24.751 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.752 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.753 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam dst=null perm=null proto=rpc
20:14:24.753 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bam.bai dst=null perm=null proto=rpc
20:14:24.754 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a6981c0e-f169-4b09-b7d9-98135ca4dec6.bai dst=null perm=null proto=rpc
20:14:24.755 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:24.758 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
20:14:24.758 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:24.760 INFO Executor - Finished task 0.0 in stage 239.0 (TID 295). 989 bytes result sent to driver
20:14:24.760 INFO TaskSetManager - Finished task 0.0 in stage 239.0 (TID 295) in 91 ms on localhost (executor driver) (1/1)
20:14:24.760 INFO TaskSchedulerImpl - Removed TaskSet 239.0, whose tasks have all completed, from pool
20:14:24.761 INFO DAGScheduler - ResultStage 239 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.110 s
20:14:24.761 INFO DAGScheduler - Job 180 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:24.761 INFO TaskSchedulerImpl - Killing all running tasks in stage 239: Stage finished
20:14:24.761 INFO DAGScheduler - Job 180 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.110612 s
20:14:24.769 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:24.770 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.771 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:24.771 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:24.773 INFO MemoryStore - Block broadcast_480 stored as values in memory (estimated size 298.0 KiB, free 1917.3 MiB)
20:14:24.779 INFO MemoryStore - Block broadcast_480_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.3 MiB)
20:14:24.779 INFO BlockManagerInfo - Added broadcast_480_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.4 MiB)
20:14:24.780 INFO SparkContext - Created broadcast 480 from newAPIHadoopFile at PathSplitSource.java:96
20:14:24.801 INFO MemoryStore - Block broadcast_481 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
20:14:24.807 INFO MemoryStore - Block broadcast_481_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.9 MiB)
20:14:24.807 INFO BlockManagerInfo - Added broadcast_481_piece0 in memory on localhost:35739 (size: 50.3 KiB, free: 1919.4 MiB)
20:14:24.807 INFO SparkContext - Created broadcast 481 from newAPIHadoopFile at PathSplitSource.java:96
20:14:24.827 INFO FileInputFormat - Total input files to process : 1
20:14:24.829 INFO MemoryStore - Block broadcast_482 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
20:14:24.829 INFO MemoryStore - Block broadcast_482_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
20:14:24.830 INFO BlockManagerInfo - Added broadcast_482_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:24.830 INFO SparkContext - Created broadcast 482 from broadcast at ReadsSparkSink.java:133
20:14:24.831 INFO MemoryStore - Block broadcast_483 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
20:14:24.832 INFO MemoryStore - Block broadcast_483_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
20:14:24.832 INFO BlockManagerInfo - Added broadcast_483_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:24.832 INFO SparkContext - Created broadcast 483 from broadcast at BamSink.java:76
20:14:24.834 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts dst=null perm=null proto=rpc
20:14:24.834 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:24.834 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:24.834 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:24.835 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:24.841 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:24.841 INFO DAGScheduler - Registering RDD 1155 (mapToPair at SparkUtils.java:161) as input to shuffle 48
20:14:24.841 INFO DAGScheduler - Got job 181 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:24.841 INFO DAGScheduler - Final stage: ResultStage 241 (runJob at SparkHadoopWriter.scala:83)
20:14:24.841 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 240)
20:14:24.841 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 240)
20:14:24.841 INFO DAGScheduler - Submitting ShuffleMapStage 240 (MapPartitionsRDD[1155] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:24.858 INFO MemoryStore - Block broadcast_484 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
20:14:24.859 INFO MemoryStore - Block broadcast_484_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
20:14:24.859 INFO BlockManagerInfo - Added broadcast_484_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.2 MiB)
20:14:24.860 INFO SparkContext - Created broadcast 484 from broadcast at DAGScheduler.scala:1580
20:14:24.860 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 240 (MapPartitionsRDD[1155] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:24.860 INFO TaskSchedulerImpl - Adding task set 240.0 with 1 tasks resource profile 0
20:14:24.860 INFO TaskSetManager - Starting task 0.0 in stage 240.0 (TID 296) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
20:14:24.861 INFO Executor - Running task 0.0 in stage 240.0 (TID 296)
20:14:24.891 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:14:24.907 INFO Executor - Finished task 0.0 in stage 240.0 (TID 296). 1148 bytes result sent to driver
20:14:24.907 INFO TaskSetManager - Finished task 0.0 in stage 240.0 (TID 296) in 47 ms on localhost (executor driver) (1/1)
20:14:24.907 INFO TaskSchedulerImpl - Removed TaskSet 240.0, whose tasks have all completed, from pool
20:14:24.907 INFO DAGScheduler - ShuffleMapStage 240 (mapToPair at SparkUtils.java:161) finished in 0.065 s
20:14:24.907 INFO DAGScheduler - looking for newly runnable stages
20:14:24.908 INFO DAGScheduler - running: HashSet()
20:14:24.908 INFO DAGScheduler - waiting: HashSet(ResultStage 241)
20:14:24.908 INFO DAGScheduler - failed: HashSet()
20:14:24.908 INFO DAGScheduler - Submitting ResultStage 241 (MapPartitionsRDD[1160] at mapToPair at BamSink.java:91), which has no missing parents
20:14:24.914 INFO MemoryStore - Block broadcast_485 stored as values in memory (estimated size 241.5 KiB, free 1915.7 MiB)
20:14:24.915 INFO MemoryStore - Block broadcast_485_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.6 MiB)
20:14:24.915 INFO BlockManagerInfo - Added broadcast_485_piece0 in memory on localhost:35739 (size: 67.1 KiB, free: 1919.1 MiB)
20:14:24.915 INFO SparkContext - Created broadcast 485 from broadcast at DAGScheduler.scala:1580
20:14:24.916 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 241 (MapPartitionsRDD[1160] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:24.916 INFO TaskSchedulerImpl - Adding task set 241.0 with 1 tasks resource profile 0
20:14:24.916 INFO TaskSetManager - Starting task 0.0 in stage 241.0 (TID 297) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:24.916 INFO Executor - Running task 0.0 in stage 241.0 (TID 297)
20:14:24.920 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:24.920 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:24.931 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:24.931 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:24.931 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:24.931 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:24.931 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:24.931 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:24.932 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.933 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.934 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:24.936 INFO StateChange - BLOCK* allocate blk_1073741898_1074, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/part-r-00000
20:14:24.937 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741898_1074 src: /127.0.0.1:36584 dest: /127.0.0.1:38353
20:14:24.938 INFO clienttrace - src: /127.0.0.1:36584, dest: /127.0.0.1:38353, bytes: 229774, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741898_1074, duration(ns): 1039727
20:14:24.939 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741898_1074, type=LAST_IN_PIPELINE terminating
20:14:24.939 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:24.940 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:14:24.940 INFO StateChange - BLOCK* allocate blk_1073741899_1075, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/.part-r-00000.sbi
20:14:24.941 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741899_1075 src: /127.0.0.1:36600 dest: /127.0.0.1:38353
20:14:24.942 INFO clienttrace - src: /127.0.0.1:36600, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741899_1075, duration(ns): 329066
20:14:24.942 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741899_1075, type=LAST_IN_PIPELINE terminating
20:14:24.943 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:24.944 INFO StateChange - BLOCK* allocate blk_1073741900_1076, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/.part-r-00000.bai
20:14:24.945 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741900_1076 src: /127.0.0.1:36606 dest: /127.0.0.1:38353
20:14:24.946 INFO clienttrace - src: /127.0.0.1:36606, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741900_1076, duration(ns): 449472
20:14:24.946 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741900_1076, type=LAST_IN_PIPELINE terminating
20:14:24.947 INFO FSNamesystem - BLOCK* blk_1073741900_1076 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/.part-r-00000.bai
20:14:25.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741891_1067 replica FinalizedReplica, blk_1073741891_1067, FINALIZED
getNumBytes() = 212
getBytesOnDisk() = 212
getVisibleLength()= 212
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data1
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741891 for deletion
20:14:25.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741891_1067 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741891
20:14:25.347 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:25.348 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0 dst=null perm=null proto=rpc
20:14:25.349 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0 dst=null perm=null proto=rpc
20:14:25.349 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/task_202502102014248230797469997224707_1160_r_000000 dst=null perm=null proto=rpc
20:14:25.350 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/_temporary/attempt_202502102014248230797469997224707_1160_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/task_202502102014248230797469997224707_1160_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:25.350 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014248230797469997224707_1160_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/task_202502102014248230797469997224707_1160_r_000000
20:14:25.350 INFO SparkHadoopMapRedUtil - attempt_202502102014248230797469997224707_1160_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:25.350 INFO Executor - Finished task 0.0 in stage 241.0 (TID 297). 1858 bytes result sent to driver
20:14:25.350 INFO TaskSetManager - Finished task 0.0 in stage 241.0 (TID 297) in 434 ms on localhost (executor driver) (1/1)
20:14:25.351 INFO TaskSchedulerImpl - Removed TaskSet 241.0, whose tasks have all completed, from pool
20:14:25.351 INFO DAGScheduler - ResultStage 241 (runJob at SparkHadoopWriter.scala:83) finished in 0.443 s
20:14:25.351 INFO DAGScheduler - Job 181 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:25.351 INFO TaskSchedulerImpl - Killing all running tasks in stage 241: Stage finished
20:14:25.351 INFO DAGScheduler - Job 181 finished: runJob at SparkHadoopWriter.scala:83, took 0.510167 s
20:14:25.351 INFO SparkHadoopWriter - Start to commit write Job job_202502102014248230797469997224707_1160.
20:14:25.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:25.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts dst=null perm=null proto=rpc
20:14:25.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/task_202502102014248230797469997224707_1160_r_000000 dst=null perm=null proto=rpc
20:14:25.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:25.353 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/task_202502102014248230797469997224707_1160_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:25.353 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:25.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/task_202502102014248230797469997224707_1160_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:25.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:25.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary/0/task_202502102014248230797469997224707_1160_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:25.355 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_temporary dst=null perm=null proto=rpc
20:14:25.355 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:25.356 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:25.357 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.spark-staging-1160 dst=null perm=null proto=rpc
20:14:25.357 INFO SparkHadoopWriter - Write Job job_202502102014248230797469997224707_1160 committed. Elapsed time: 5 ms.
20:14:25.357 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:25.358 INFO StateChange - BLOCK* allocate blk_1073741901_1077, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/header
20:14:25.359 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741901_1077 src: /127.0.0.1:36610 dest: /127.0.0.1:38353
20:14:25.360 INFO clienttrace - src: /127.0.0.1:36610, dest: /127.0.0.1:38353, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741901_1077, duration(ns): 441038
20:14:25.360 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741901_1077, type=LAST_IN_PIPELINE terminating
20:14:25.361 INFO FSNamesystem - BLOCK* blk_1073741901_1077 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/header
20:14:25.761 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:25.762 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:25.763 INFO StateChange - BLOCK* allocate blk_1073741902_1078, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/terminator
20:14:25.764 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741902_1078 src: /127.0.0.1:36622 dest: /127.0.0.1:38353
20:14:25.765 INFO clienttrace - src: /127.0.0.1:36622, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741902_1078, duration(ns): 387543
20:14:25.765 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741902_1078, type=LAST_IN_PIPELINE terminating
20:14:25.765 INFO FSNamesystem - BLOCK* blk_1073741902_1078 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/terminator
20:14:26.166 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:26.167 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts dst=null perm=null proto=rpc
20:14:26.168 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.168 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:26.169 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam
20:14:26.169 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.169 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.170 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.170 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.171 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam done
20:14:26.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.171 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi
20:14:26.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts dst=null perm=null proto=rpc
20:14:26.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:26.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:26.174 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:26.174 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:26.175 INFO StateChange - BLOCK* allocate blk_1073741903_1079, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi
20:14:26.175 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741903_1079 src: /127.0.0.1:36638 dest: /127.0.0.1:38353
20:14:26.176 INFO clienttrace - src: /127.0.0.1:36638, dest: /127.0.0.1:38353, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741903_1079, duration(ns): 349595
20:14:26.177 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741903_1079, type=LAST_IN_PIPELINE terminating
20:14:26.177 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:26.177 INFO IndexFileMerger - Done merging .sbi files
20:14:26.177 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai
20:14:26.178 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts dst=null perm=null proto=rpc
20:14:26.178 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.179 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:26.179 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:26.180 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:26.180 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:26.181 INFO StateChange - BLOCK* allocate blk_1073741904_1080, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai
20:14:26.182 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741904_1080 src: /127.0.0.1:36642 dest: /127.0.0.1:38353
20:14:26.183 INFO clienttrace - src: /127.0.0.1:36642, dest: /127.0.0.1:38353, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741904_1080, duration(ns): 330158
20:14:26.183 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741904_1080, type=LAST_IN_PIPELINE terminating
20:14:26.183 INFO FSNamesystem - BLOCK* blk_1073741904_1080 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai
20:14:26.584 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:26.584 INFO IndexFileMerger - Done merging .bai files
20:14:26.584 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.parts dst=null perm=null proto=rpc
20:14:26.593 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.600 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi dst=null perm=null proto=rpc
20:14:26.600 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi dst=null perm=null proto=rpc
20:14:26.601 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi dst=null perm=null proto=rpc
20:14:26.601 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:26.602 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.602 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.602 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.603 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.603 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.603 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.604 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.605 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:26.609 INFO BlockManagerInfo - Removed broadcast_479_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.3 MiB)
20:14:26.610 INFO BlockManagerInfo - Removed broadcast_483_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.3 MiB)
20:14:26.610 INFO BlockManagerInfo - Removed broadcast_485_piece0 on localhost:35739 in memory (size: 67.1 KiB, free: 1919.3 MiB)
20:14:26.611 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:26.611 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
20:14:26.611 INFO BlockManagerInfo - Removed broadcast_484_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.5 MiB)
20:14:26.611 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi dst=null perm=null proto=rpc
20:14:26.611 INFO BlockManagerInfo - Removed broadcast_481_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.5 MiB)
20:14:26.612 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi dst=null perm=null proto=rpc
20:14:26.612 INFO BlockManagerInfo - Removed broadcast_478_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:26.612 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.sbi dst=null perm=null proto=rpc
20:14:26.612 INFO BlockManagerInfo - Removed broadcast_477_piece0 on localhost:35739 in memory (size: 153.7 KiB, free: 1919.8 MiB)
20:14:26.612 INFO BlockManagerInfo - Removed broadcast_482_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.9 MiB)
20:14:26.613 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
20:14:26.613 INFO MemoryStore - Block broadcast_486 stored as values in memory (estimated size 320.0 B, free 1919.3 MiB)
20:14:26.613 INFO BlockManagerInfo - Removed broadcast_476_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:26.613 INFO BlockManagerInfo - Removed broadcast_470_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1920.0 MiB)
20:14:26.613 INFO MemoryStore - Block broadcast_486_piece0 stored as bytes in memory (estimated size 233.0 B, free 1919.7 MiB)
20:14:26.614 INFO BlockManagerInfo - Added broadcast_486_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1920.0 MiB)
20:14:26.614 INFO SparkContext - Created broadcast 486 from broadcast at BamSource.java:104
20:14:26.615 INFO MemoryStore - Block broadcast_487 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
20:14:26.621 INFO MemoryStore - Block broadcast_487_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:14:26.621 INFO BlockManagerInfo - Added broadcast_487_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:26.621 INFO SparkContext - Created broadcast 487 from newAPIHadoopFile at PathSplitSource.java:96
20:14:26.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.630 INFO FileInputFormat - Total input files to process : 1
20:14:26.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.644 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:26.645 INFO DAGScheduler - Got job 182 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:26.645 INFO DAGScheduler - Final stage: ResultStage 242 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:26.645 INFO DAGScheduler - Parents of final stage: List()
20:14:26.645 INFO DAGScheduler - Missing parents: List()
20:14:26.645 INFO DAGScheduler - Submitting ResultStage 242 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:26.651 INFO MemoryStore - Block broadcast_488 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
20:14:26.651 INFO MemoryStore - Block broadcast_488_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
20:14:26.652 INFO BlockManagerInfo - Added broadcast_488_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.8 MiB)
20:14:26.652 INFO SparkContext - Created broadcast 488 from broadcast at DAGScheduler.scala:1580
20:14:26.652 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 242 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:26.652 INFO TaskSchedulerImpl - Adding task set 242.0 with 1 tasks resource profile 0
20:14:26.652 INFO TaskSetManager - Starting task 0.0 in stage 242.0 (TID 298) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:26.653 INFO Executor - Running task 0.0 in stage 242.0 (TID 298)
20:14:26.664 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam:0+235514
20:14:26.664 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.665 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.665 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.666 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.666 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.668 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:26.669 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:26.670 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:26.671 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
20:14:26.671 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:26.673 INFO Executor - Finished task 0.0 in stage 242.0 (TID 298). 650141 bytes result sent to driver
20:14:26.674 INFO TaskSetManager - Finished task 0.0 in stage 242.0 (TID 298) in 22 ms on localhost (executor driver) (1/1)
20:14:26.675 INFO TaskSchedulerImpl - Removed TaskSet 242.0, whose tasks have all completed, from pool
20:14:26.675 INFO DAGScheduler - ResultStage 242 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.030 s
20:14:26.675 INFO DAGScheduler - Job 182 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:26.675 INFO TaskSchedulerImpl - Killing all running tasks in stage 242: Stage finished
20:14:26.675 INFO DAGScheduler - Job 182 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.030344 s
20:14:26.685 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:26.685 INFO DAGScheduler - Got job 183 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:26.685 INFO DAGScheduler - Final stage: ResultStage 243 (count at ReadsSparkSinkUnitTest.java:185)
20:14:26.685 INFO DAGScheduler - Parents of final stage: List()
20:14:26.685 INFO DAGScheduler - Missing parents: List()
20:14:26.685 INFO DAGScheduler - Submitting ResultStage 243 (MapPartitionsRDD[1148] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:26.702 INFO MemoryStore - Block broadcast_489 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
20:14:26.703 INFO MemoryStore - Block broadcast_489_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
20:14:26.703 INFO BlockManagerInfo - Added broadcast_489_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.7 MiB)
20:14:26.703 INFO SparkContext - Created broadcast 489 from broadcast at DAGScheduler.scala:1580
20:14:26.703 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 243 (MapPartitionsRDD[1148] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:26.703 INFO TaskSchedulerImpl - Adding task set 243.0 with 1 tasks resource profile 0
20:14:26.704 INFO TaskSetManager - Starting task 0.0 in stage 243.0 (TID 299) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
20:14:26.704 INFO Executor - Running task 0.0 in stage 243.0 (TID 299)
20:14:26.732 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
20:14:26.743 INFO Executor - Finished task 0.0 in stage 243.0 (TID 299). 989 bytes result sent to driver
20:14:26.744 INFO TaskSetManager - Finished task 0.0 in stage 243.0 (TID 299) in 40 ms on localhost (executor driver) (1/1)
20:14:26.744 INFO TaskSchedulerImpl - Removed TaskSet 243.0, whose tasks have all completed, from pool
20:14:26.744 INFO DAGScheduler - ResultStage 243 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.059 s
20:14:26.744 INFO DAGScheduler - Job 183 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:26.744 INFO TaskSchedulerImpl - Killing all running tasks in stage 243: Stage finished
20:14:26.744 INFO DAGScheduler - Job 183 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059492 s
20:14:26.747 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:26.748 INFO DAGScheduler - Got job 184 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:26.748 INFO DAGScheduler - Final stage: ResultStage 244 (count at ReadsSparkSinkUnitTest.java:185)
20:14:26.748 INFO DAGScheduler - Parents of final stage: List()
20:14:26.748 INFO DAGScheduler - Missing parents: List()
20:14:26.748 INFO DAGScheduler - Submitting ResultStage 244 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:26.754 INFO MemoryStore - Block broadcast_490 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
20:14:26.754 INFO MemoryStore - Block broadcast_490_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
20:14:26.755 INFO BlockManagerInfo - Added broadcast_490_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:14:26.755 INFO SparkContext - Created broadcast 490 from broadcast at DAGScheduler.scala:1580
20:14:26.755 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 244 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:26.755 INFO TaskSchedulerImpl - Adding task set 244.0 with 1 tasks resource profile 0
20:14:26.755 INFO TaskSetManager - Starting task 0.0 in stage 244.0 (TID 300) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:26.756 INFO Executor - Running task 0.0 in stage 244.0 (TID 300)
20:14:26.766 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam:0+235514
20:14:26.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam dst=null perm=null proto=rpc
20:14:26.768 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.768 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.769 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_de635446-e563-45e0-8a9d-412005d52559.bam.bai dst=null perm=null proto=rpc
20:14:26.770 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
20:14:26.771 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:26.772 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
20:14:26.773 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
20:14:26.773 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:26.774 INFO Executor - Finished task 0.0 in stage 244.0 (TID 300). 989 bytes result sent to driver
20:14:26.774 INFO TaskSetManager - Finished task 0.0 in stage 244.0 (TID 300) in 19 ms on localhost (executor driver) (1/1)
20:14:26.774 INFO TaskSchedulerImpl - Removed TaskSet 244.0, whose tasks have all completed, from pool
20:14:26.774 INFO DAGScheduler - ResultStage 244 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
20:14:26.775 INFO DAGScheduler - Job 184 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:26.775 INFO TaskSchedulerImpl - Killing all running tasks in stage 244: Stage finished
20:14:26.775 INFO DAGScheduler - Job 184 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027208 s
20:14:26.783 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:26.784 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.784 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:26.785 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:26.786 INFO MemoryStore - Block broadcast_491 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
20:14:26.793 INFO MemoryStore - Block broadcast_491_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:26.793 INFO BlockManagerInfo - Added broadcast_491_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:26.793 INFO SparkContext - Created broadcast 491 from newAPIHadoopFile at PathSplitSource.java:96
20:14:26.814 INFO MemoryStore - Block broadcast_492 stored as values in memory (estimated size 298.0 KiB, free 1917.7 MiB)
20:14:26.820 INFO MemoryStore - Block broadcast_492_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
20:14:26.820 INFO BlockManagerInfo - Added broadcast_492_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:26.820 INFO SparkContext - Created broadcast 492 from newAPIHadoopFile at PathSplitSource.java:96
20:14:26.840 INFO FileInputFormat - Total input files to process : 1
20:14:26.841 INFO MemoryStore - Block broadcast_493 stored as values in memory (estimated size 19.6 KiB, free 1917.7 MiB)
20:14:26.841 INFO MemoryStore - Block broadcast_493_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.7 MiB)
20:14:26.842 INFO BlockManagerInfo - Added broadcast_493_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.5 MiB)
20:14:26.842 INFO SparkContext - Created broadcast 493 from broadcast at ReadsSparkSink.java:133
20:14:26.842 INFO MemoryStore - Block broadcast_494 stored as values in memory (estimated size 20.0 KiB, free 1917.6 MiB)
20:14:26.843 INFO MemoryStore - Block broadcast_494_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.6 MiB)
20:14:26.843 INFO BlockManagerInfo - Added broadcast_494_piece0 in memory on localhost:35739 (size: 1890.0 B, free: 1919.5 MiB)
20:14:26.843 INFO SparkContext - Created broadcast 494 from broadcast at BamSink.java:76
20:14:26.845 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts dst=null perm=null proto=rpc
20:14:26.845 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:26.845 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:26.845 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:26.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:26.852 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:26.852 INFO DAGScheduler - Registering RDD 1180 (mapToPair at SparkUtils.java:161) as input to shuffle 49
20:14:26.852 INFO DAGScheduler - Got job 185 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:26.852 INFO DAGScheduler - Final stage: ResultStage 246 (runJob at SparkHadoopWriter.scala:83)
20:14:26.852 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 245)
20:14:26.853 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 245)
20:14:26.853 INFO DAGScheduler - Submitting ShuffleMapStage 245 (MapPartitionsRDD[1180] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:26.869 INFO MemoryStore - Block broadcast_495 stored as values in memory (estimated size 434.3 KiB, free 1917.2 MiB)
20:14:26.871 INFO MemoryStore - Block broadcast_495_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1917.1 MiB)
20:14:26.871 INFO BlockManagerInfo - Added broadcast_495_piece0 in memory on localhost:35739 (size: 157.6 KiB, free: 1919.4 MiB)
20:14:26.871 INFO SparkContext - Created broadcast 495 from broadcast at DAGScheduler.scala:1580
20:14:26.871 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 245 (MapPartitionsRDD[1180] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:26.871 INFO TaskSchedulerImpl - Adding task set 245.0 with 1 tasks resource profile 0
20:14:26.872 INFO TaskSetManager - Starting task 0.0 in stage 245.0 (TID 301) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
20:14:26.872 INFO Executor - Running task 0.0 in stage 245.0 (TID 301)
20:14:26.902 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:14:26.914 INFO Executor - Finished task 0.0 in stage 245.0 (TID 301). 1148 bytes result sent to driver
20:14:26.914 INFO TaskSetManager - Finished task 0.0 in stage 245.0 (TID 301) in 43 ms on localhost (executor driver) (1/1)
20:14:26.914 INFO TaskSchedulerImpl - Removed TaskSet 245.0, whose tasks have all completed, from pool
20:14:26.914 INFO DAGScheduler - ShuffleMapStage 245 (mapToPair at SparkUtils.java:161) finished in 0.061 s
20:14:26.914 INFO DAGScheduler - looking for newly runnable stages
20:14:26.914 INFO DAGScheduler - running: HashSet()
20:14:26.914 INFO DAGScheduler - waiting: HashSet(ResultStage 246)
20:14:26.914 INFO DAGScheduler - failed: HashSet()
20:14:26.914 INFO DAGScheduler - Submitting ResultStage 246 (MapPartitionsRDD[1185] at mapToPair at BamSink.java:91), which has no missing parents
20:14:26.921 INFO MemoryStore - Block broadcast_496 stored as values in memory (estimated size 155.4 KiB, free 1916.9 MiB)
20:14:26.921 INFO MemoryStore - Block broadcast_496_piece0 stored as bytes in memory (estimated size 58.5 KiB, free 1916.8 MiB)
20:14:26.921 INFO BlockManagerInfo - Added broadcast_496_piece0 in memory on localhost:35739 (size: 58.5 KiB, free: 1919.3 MiB)
20:14:26.922 INFO SparkContext - Created broadcast 496 from broadcast at DAGScheduler.scala:1580
20:14:26.922 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 246 (MapPartitionsRDD[1185] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:26.922 INFO TaskSchedulerImpl - Adding task set 246.0 with 1 tasks resource profile 0
20:14:26.922 INFO TaskSetManager - Starting task 0.0 in stage 246.0 (TID 302) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:26.922 INFO Executor - Running task 0.0 in stage 246.0 (TID 302)
20:14:26.926 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:26.926 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:26.937 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:26.937 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:26.937 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:26.937 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:26.937 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:26.937 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:26.938 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.939 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.940 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:26.942 INFO StateChange - BLOCK* allocate blk_1073741905_1081, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/part-r-00000
20:14:26.944 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741905_1081 src: /127.0.0.1:36660 dest: /127.0.0.1:38353
20:14:26.946 INFO clienttrace - src: /127.0.0.1:36660, dest: /127.0.0.1:38353, bytes: 235299, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741905_1081, duration(ns): 1039657
20:14:26.946 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741905_1081, type=LAST_IN_PIPELINE terminating
20:14:26.946 INFO FSNamesystem - BLOCK* blk_1073741905_1081 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/part-r-00000
20:14:27.347 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:27.348 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
20:14:27.348 INFO StateChange - BLOCK* allocate blk_1073741906_1082, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/.part-r-00000.sbi
20:14:27.349 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741906_1082 src: /127.0.0.1:36672 dest: /127.0.0.1:38353
20:14:27.351 INFO clienttrace - src: /127.0.0.1:36672, dest: /127.0.0.1:38353, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741906_1082, duration(ns): 404347
20:14:27.351 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741906_1082, type=LAST_IN_PIPELINE terminating
20:14:27.352 INFO FSNamesystem - BLOCK* blk_1073741906_1082 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/.part-r-00000.sbi
20:14:27.752 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:27.753 INFO StateChange - BLOCK* allocate blk_1073741907_1083, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/.part-r-00000.bai
20:14:27.754 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741907_1083 src: /127.0.0.1:45312 dest: /127.0.0.1:38353
20:14:27.755 INFO clienttrace - src: /127.0.0.1:45312, dest: /127.0.0.1:38353, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741907_1083, duration(ns): 348723
20:14:27.755 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741907_1083, type=LAST_IN_PIPELINE terminating
20:14:27.755 INFO FSNamesystem - BLOCK* blk_1073741907_1083 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/.part-r-00000.bai
20:14:28.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741899_1075 replica FinalizedReplica, blk_1073741899_1075, FINALIZED
getNumBytes() = 212
getBytesOnDisk() = 212
getVisibleLength()= 212
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data1
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741899 for deletion
20:14:28.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741900_1076 replica FinalizedReplica, blk_1073741900_1076, FINALIZED
getNumBytes() = 5472
getBytesOnDisk() = 5472
getVisibleLength()= 5472
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data2
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741900 for deletion
20:14:28.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741899_1075 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741899
20:14:28.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741900_1076 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741900
20:14:28.156 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:28.157 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0 dst=null perm=null proto=rpc
20:14:28.157 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0 dst=null perm=null proto=rpc
20:14:28.158 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/task_202502102014268019453176007134998_1185_r_000000 dst=null perm=null proto=rpc
20:14:28.158 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/_temporary/attempt_202502102014268019453176007134998_1185_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/task_202502102014268019453176007134998_1185_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:28.158 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014268019453176007134998_1185_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/task_202502102014268019453176007134998_1185_r_000000
20:14:28.159 INFO SparkHadoopMapRedUtil - attempt_202502102014268019453176007134998_1185_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:28.159 INFO Executor - Finished task 0.0 in stage 246.0 (TID 302). 1858 bytes result sent to driver
20:14:28.159 INFO TaskSetManager - Finished task 0.0 in stage 246.0 (TID 302) in 1237 ms on localhost (executor driver) (1/1)
20:14:28.159 INFO TaskSchedulerImpl - Removed TaskSet 246.0, whose tasks have all completed, from pool
20:14:28.160 INFO DAGScheduler - ResultStage 246 (runJob at SparkHadoopWriter.scala:83) finished in 1.245 s
20:14:28.160 INFO DAGScheduler - Job 185 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:28.160 INFO TaskSchedulerImpl - Killing all running tasks in stage 246: Stage finished
20:14:28.160 INFO DAGScheduler - Job 185 finished: runJob at SparkHadoopWriter.scala:83, took 1.307884 s
20:14:28.160 INFO SparkHadoopWriter - Start to commit write Job job_202502102014268019453176007134998_1185.
20:14:28.161 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:28.161 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts dst=null perm=null proto=rpc
20:14:28.162 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/task_202502102014268019453176007134998_1185_r_000000 dst=null perm=null proto=rpc
20:14:28.162 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:28.162 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/task_202502102014268019453176007134998_1185_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.163 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:28.163 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/task_202502102014268019453176007134998_1185_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.164 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:28.164 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary/0/task_202502102014268019453176007134998_1185_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.165 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_temporary dst=null perm=null proto=rpc
20:14:28.165 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.166 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:28.166 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.spark-staging-1185 dst=null perm=null proto=rpc
20:14:28.166 INFO SparkHadoopWriter - Write Job job_202502102014268019453176007134998_1185 committed. Elapsed time: 6 ms.
20:14:28.167 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.168 INFO StateChange - BLOCK* allocate blk_1073741908_1084, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/header
20:14:28.169 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741908_1084 src: /127.0.0.1:45320 dest: /127.0.0.1:38353
20:14:28.170 INFO clienttrace - src: /127.0.0.1:45320, dest: /127.0.0.1:38353, bytes: 1190, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741908_1084, duration(ns): 397370
20:14:28.170 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741908_1084, type=LAST_IN_PIPELINE terminating
20:14:28.170 INFO FSNamesystem - BLOCK* blk_1073741908_1084 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/header
20:14:28.571 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:28.572 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.572 INFO StateChange - BLOCK* allocate blk_1073741909_1085, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/terminator
20:14:28.573 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741909_1085 src: /127.0.0.1:45332 dest: /127.0.0.1:38353
20:14:28.574 INFO clienttrace - src: /127.0.0.1:45332, dest: /127.0.0.1:38353, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741909_1085, duration(ns): 380906
20:14:28.574 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741909_1085, type=LAST_IN_PIPELINE terminating
20:14:28.575 INFO FSNamesystem - BLOCK* blk_1073741909_1085 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/terminator
20:14:28.975 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:28.976 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts dst=null perm=null proto=rpc
20:14:28.977 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.978 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:28.978 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam
20:14:28.978 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.979 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:28.979 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:28.980 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.980 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam done
20:14:28.980 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:28.980 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi
20:14:28.980 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts dst=null perm=null proto=rpc
20:14:28.981 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:28.982 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:28.982 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:28.983 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
20:14:28.984 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
20:14:28.984 INFO StateChange - BLOCK* allocate blk_1073741910_1086, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi
20:14:28.985 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741910_1086 src: /127.0.0.1:45338 dest: /127.0.0.1:38353
20:14:28.986 INFO clienttrace - src: /127.0.0.1:45338, dest: /127.0.0.1:38353, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741910_1086, duration(ns): 366145
20:14:28.986 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741910_1086, type=LAST_IN_PIPELINE terminating
20:14:28.986 INFO FSNamesystem - BLOCK* blk_1073741910_1086 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi
20:14:29.387 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:29.388 INFO IndexFileMerger - Done merging .sbi files
20:14:29.388 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/ to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai
20:14:29.388 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts dst=null perm=null proto=rpc
20:14:29.389 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:29.390 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:29.390 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:29.391 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:29.392 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
20:14:29.392 INFO StateChange - BLOCK* allocate blk_1073741911_1087, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai
20:14:29.393 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741911_1087 src: /127.0.0.1:45352 dest: /127.0.0.1:38353
20:14:29.394 INFO clienttrace - src: /127.0.0.1:45352, dest: /127.0.0.1:38353, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741911_1087, duration(ns): 354474
20:14:29.394 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741911_1087, type=LAST_IN_PIPELINE terminating
20:14:29.394 INFO FSNamesystem - BLOCK* blk_1073741911_1087 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai
20:14:29.795 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:29.795 INFO IndexFileMerger - Done merging .bai files
20:14:29.796 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.parts dst=null perm=null proto=rpc
20:14:29.804 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:29.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi dst=null perm=null proto=rpc
20:14:29.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi dst=null perm=null proto=rpc
20:14:29.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi dst=null perm=null proto=rpc
20:14:29.813 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
20:14:29.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:29.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:29.814 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:29.814 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:29.815 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:29.815 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:29.815 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:29.816 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
20:14:29.817 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:29.818 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:29.818 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
20:14:29.818 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi dst=null perm=null proto=rpc
20:14:29.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi dst=null perm=null proto=rpc
20:14:29.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.sbi dst=null perm=null proto=rpc
20:14:29.820 INFO MemoryStore - Block broadcast_497 stored as values in memory (estimated size 312.0 B, free 1916.8 MiB)
20:14:29.821 INFO MemoryStore - Block broadcast_497_piece0 stored as bytes in memory (estimated size 231.0 B, free 1916.8 MiB)
20:14:29.821 INFO BlockManagerInfo - Added broadcast_497_piece0 in memory on localhost:35739 (size: 231.0 B, free: 1919.3 MiB)
20:14:29.821 INFO SparkContext - Created broadcast 497 from broadcast at BamSource.java:104
20:14:29.822 INFO MemoryStore - Block broadcast_498 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
20:14:29.830 INFO BlockManagerInfo - Removed broadcast_486_piece0 on localhost:35739 in memory (size: 233.0 B, free: 1919.3 MiB)
20:14:29.830 INFO BlockManagerInfo - Removed broadcast_495_piece0 on localhost:35739 in memory (size: 157.6 KiB, free: 1919.5 MiB)
20:14:29.830 INFO BlockManagerInfo - Removed broadcast_487_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:29.831 INFO BlockManagerInfo - Removed broadcast_489_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.7 MiB)
20:14:29.831 INFO BlockManagerInfo - Removed broadcast_493_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.7 MiB)
20:14:29.832 INFO BlockManagerInfo - Removed broadcast_494_piece0 on localhost:35739 in memory (size: 1890.0 B, free: 1919.7 MiB)
20:14:29.832 INFO BlockManagerInfo - Removed broadcast_490_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.7 MiB)
20:14:29.832 INFO BlockManagerInfo - Removed broadcast_496_piece0 on localhost:35739 in memory (size: 58.5 KiB, free: 1919.8 MiB)
20:14:29.833 INFO BlockManagerInfo - Removed broadcast_480_piece0 on localhost:35739 in memory (size: 50.3 KiB, free: 1919.8 MiB)
20:14:29.833 INFO BlockManagerInfo - Removed broadcast_492_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:29.834 INFO BlockManagerInfo - Removed broadcast_488_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1920.0 MiB)
20:14:29.837 INFO MemoryStore - Block broadcast_498_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:14:29.837 INFO BlockManagerInfo - Added broadcast_498_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:29.837 INFO SparkContext - Created broadcast 498 from newAPIHadoopFile at PathSplitSource.java:96
20:14:29.852 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:29.852 INFO FileInputFormat - Total input files to process : 1
20:14:29.852 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:29.873 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:29.873 INFO DAGScheduler - Got job 186 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:29.873 INFO DAGScheduler - Final stage: ResultStage 247 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:29.873 INFO DAGScheduler - Parents of final stage: List()
20:14:29.873 INFO DAGScheduler - Missing parents: List()
20:14:29.873 INFO DAGScheduler - Submitting ResultStage 247 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:29.883 INFO MemoryStore - Block broadcast_499 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
20:14:29.883 INFO MemoryStore - Block broadcast_499_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
20:14:29.883 INFO BlockManagerInfo - Added broadcast_499_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.8 MiB)
20:14:29.884 INFO SparkContext - Created broadcast 499 from broadcast at DAGScheduler.scala:1580
20:14:29.884 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 247 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:29.884 INFO TaskSchedulerImpl - Adding task set 247.0 with 1 tasks resource profile 0
20:14:29.884 INFO TaskSetManager - Starting task 0.0 in stage 247.0 (TID 303) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:29.885 INFO Executor - Running task 0.0 in stage 247.0 (TID 303)
20:14:29.896 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam:0+236517
20:14:29.897 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:29.897 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:29.898 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:29.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:29.899 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:29.900 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
20:14:29.901 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:29.902 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:29.903 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
20:14:29.903 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:29.905 INFO Executor - Finished task 0.0 in stage 247.0 (TID 303). 749470 bytes result sent to driver
20:14:29.907 INFO TaskSetManager - Finished task 0.0 in stage 247.0 (TID 303) in 23 ms on localhost (executor driver) (1/1)
20:14:29.908 INFO TaskSchedulerImpl - Removed TaskSet 247.0, whose tasks have all completed, from pool
20:14:29.908 INFO DAGScheduler - ResultStage 247 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.035 s
20:14:29.908 INFO DAGScheduler - Job 186 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:29.908 INFO TaskSchedulerImpl - Killing all running tasks in stage 247: Stage finished
20:14:29.908 INFO DAGScheduler - Job 186 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.035210 s
20:14:29.918 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:29.918 INFO DAGScheduler - Got job 187 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:29.918 INFO DAGScheduler - Final stage: ResultStage 248 (count at ReadsSparkSinkUnitTest.java:185)
20:14:29.918 INFO DAGScheduler - Parents of final stage: List()
20:14:29.918 INFO DAGScheduler - Missing parents: List()
20:14:29.918 INFO DAGScheduler - Submitting ResultStage 248 (MapPartitionsRDD[1173] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:29.935 INFO MemoryStore - Block broadcast_500 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
20:14:29.936 INFO MemoryStore - Block broadcast_500_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
20:14:29.936 INFO BlockManagerInfo - Added broadcast_500_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.7 MiB)
20:14:29.936 INFO SparkContext - Created broadcast 500 from broadcast at DAGScheduler.scala:1580
20:14:29.936 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 248 (MapPartitionsRDD[1173] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:29.937 INFO TaskSchedulerImpl - Adding task set 248.0 with 1 tasks resource profile 0
20:14:29.937 INFO TaskSetManager - Starting task 0.0 in stage 248.0 (TID 304) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
20:14:29.937 INFO Executor - Running task 0.0 in stage 248.0 (TID 304)
20:14:29.966 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
20:14:29.973 INFO Executor - Finished task 0.0 in stage 248.0 (TID 304). 989 bytes result sent to driver
20:14:29.974 INFO TaskSetManager - Finished task 0.0 in stage 248.0 (TID 304) in 37 ms on localhost (executor driver) (1/1)
20:14:29.974 INFO TaskSchedulerImpl - Removed TaskSet 248.0, whose tasks have all completed, from pool
20:14:29.974 INFO DAGScheduler - ResultStage 248 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.056 s
20:14:29.974 INFO DAGScheduler - Job 187 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:29.974 INFO TaskSchedulerImpl - Killing all running tasks in stage 248: Stage finished
20:14:29.974 INFO DAGScheduler - Job 187 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.056163 s
20:14:29.977 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:29.978 INFO DAGScheduler - Got job 188 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:29.978 INFO DAGScheduler - Final stage: ResultStage 249 (count at ReadsSparkSinkUnitTest.java:185)
20:14:29.978 INFO DAGScheduler - Parents of final stage: List()
20:14:29.978 INFO DAGScheduler - Missing parents: List()
20:14:29.978 INFO DAGScheduler - Submitting ResultStage 249 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:29.984 INFO MemoryStore - Block broadcast_501 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
20:14:29.984 INFO MemoryStore - Block broadcast_501_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
20:14:29.984 INFO BlockManagerInfo - Added broadcast_501_piece0 in memory on localhost:35739 (size: 54.6 KiB, free: 1919.6 MiB)
20:14:29.985 INFO SparkContext - Created broadcast 501 from broadcast at DAGScheduler.scala:1580
20:14:29.985 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 249 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:29.985 INFO TaskSchedulerImpl - Adding task set 249.0 with 1 tasks resource profile 0
20:14:29.985 INFO TaskSetManager - Starting task 0.0 in stage 249.0 (TID 305) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:29.985 INFO Executor - Running task 0.0 in stage 249.0 (TID 305)
20:14:30.000 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam:0+236517
20:14:30.001 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:30.002 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam dst=null perm=null proto=rpc
20:14:30.002 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:30.002 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:30.003 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_c2eb787a-4073-4a7c-84e1-dd4d9d2613f4.bam.bai dst=null perm=null proto=rpc
20:14:30.004 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
20:14:30.005 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:30.005 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
20:14:30.006 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
20:14:30.007 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
20:14:30.008 INFO Executor - Finished task 0.0 in stage 249.0 (TID 305). 989 bytes result sent to driver
20:14:30.008 INFO TaskSetManager - Finished task 0.0 in stage 249.0 (TID 305) in 23 ms on localhost (executor driver) (1/1)
20:14:30.008 INFO TaskSchedulerImpl - Removed TaskSet 249.0, whose tasks have all completed, from pool
20:14:30.009 INFO DAGScheduler - ResultStage 249 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.031 s
20:14:30.009 INFO DAGScheduler - Job 188 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:30.009 INFO TaskSchedulerImpl - Killing all running tasks in stage 249: Stage finished
20:14:30.009 INFO DAGScheduler - Job 188 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.031416 s
20:14:30.018 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:30.018 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:30.019 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:30.019 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:30.021 INFO MemoryStore - Block broadcast_502 stored as values in memory (estimated size 576.0 B, free 1918.4 MiB)
20:14:30.021 INFO MemoryStore - Block broadcast_502_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.4 MiB)
20:14:30.022 INFO BlockManagerInfo - Added broadcast_502_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1919.6 MiB)
20:14:30.022 INFO SparkContext - Created broadcast 502 from broadcast at CramSource.java:114
20:14:30.023 INFO MemoryStore - Block broadcast_503 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
20:14:30.029 INFO MemoryStore - Block broadcast_503_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
20:14:30.029 INFO BlockManagerInfo - Added broadcast_503_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:30.029 INFO SparkContext - Created broadcast 503 from newAPIHadoopFile at PathSplitSource.java:96
20:14:30.044 INFO MemoryStore - Block broadcast_504 stored as values in memory (estimated size 576.0 B, free 1918.0 MiB)
20:14:30.044 INFO MemoryStore - Block broadcast_504_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.0 MiB)
20:14:30.045 INFO BlockManagerInfo - Added broadcast_504_piece0 in memory on localhost:35739 (size: 228.0 B, free: 1919.6 MiB)
20:14:30.045 INFO SparkContext - Created broadcast 504 from broadcast at CramSource.java:114
20:14:30.045 INFO MemoryStore - Block broadcast_505 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
20:14:30.051 INFO MemoryStore - Block broadcast_505_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
20:14:30.051 INFO BlockManagerInfo - Added broadcast_505_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:30.052 INFO SparkContext - Created broadcast 505 from newAPIHadoopFile at PathSplitSource.java:96
20:14:30.065 INFO FileInputFormat - Total input files to process : 1
20:14:30.067 INFO MemoryStore - Block broadcast_506 stored as values in memory (estimated size 6.0 KiB, free 1917.7 MiB)
20:14:30.067 INFO MemoryStore - Block broadcast_506_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
20:14:30.067 INFO BlockManagerInfo - Added broadcast_506_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.5 MiB)
20:14:30.067 INFO SparkContext - Created broadcast 506 from broadcast at ReadsSparkSink.java:133
20:14:30.068 INFO MemoryStore - Block broadcast_507 stored as values in memory (estimated size 6.2 KiB, free 1917.7 MiB)
20:14:30.068 INFO MemoryStore - Block broadcast_507_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
20:14:30.069 INFO BlockManagerInfo - Added broadcast_507_piece0 in memory on localhost:35739 (size: 1473.0 B, free: 1919.5 MiB)
20:14:30.069 INFO SparkContext - Created broadcast 507 from broadcast at CramSink.java:76
20:14:30.071 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts dst=null perm=null proto=rpc
20:14:30.071 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:30.071 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:30.071 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:30.072 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:30.079 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:30.079 INFO DAGScheduler - Registering RDD 1203 (mapToPair at SparkUtils.java:161) as input to shuffle 50
20:14:30.079 INFO DAGScheduler - Got job 189 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:30.079 INFO DAGScheduler - Final stage: ResultStage 251 (runJob at SparkHadoopWriter.scala:83)
20:14:30.079 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 250)
20:14:30.079 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 250)
20:14:30.079 INFO DAGScheduler - Submitting ShuffleMapStage 250 (MapPartitionsRDD[1203] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:30.099 INFO MemoryStore - Block broadcast_508 stored as values in memory (estimated size 292.8 KiB, free 1917.4 MiB)
20:14:30.100 INFO MemoryStore - Block broadcast_508_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.3 MiB)
20:14:30.101 INFO BlockManagerInfo - Added broadcast_508_piece0 in memory on localhost:35739 (size: 107.3 KiB, free: 1919.4 MiB)
20:14:30.101 INFO SparkContext - Created broadcast 508 from broadcast at DAGScheduler.scala:1580
20:14:30.101 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 250 (MapPartitionsRDD[1203] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:30.101 INFO TaskSchedulerImpl - Adding task set 250.0 with 1 tasks resource profile 0
20:14:30.101 INFO TaskSetManager - Starting task 0.0 in stage 250.0 (TID 306) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
20:14:30.102 INFO Executor - Running task 0.0 in stage 250.0 (TID 306)
20:14:30.124 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:30.138 INFO Executor - Finished task 0.0 in stage 250.0 (TID 306). 1148 bytes result sent to driver
20:14:30.139 INFO TaskSetManager - Finished task 0.0 in stage 250.0 (TID 306) in 38 ms on localhost (executor driver) (1/1)
20:14:30.139 INFO TaskSchedulerImpl - Removed TaskSet 250.0, whose tasks have all completed, from pool
20:14:30.139 INFO DAGScheduler - ShuffleMapStage 250 (mapToPair at SparkUtils.java:161) finished in 0.059 s
20:14:30.139 INFO DAGScheduler - looking for newly runnable stages
20:14:30.139 INFO DAGScheduler - running: HashSet()
20:14:30.139 INFO DAGScheduler - waiting: HashSet(ResultStage 251)
20:14:30.139 INFO DAGScheduler - failed: HashSet()
20:14:30.139 INFO DAGScheduler - Submitting ResultStage 251 (MapPartitionsRDD[1208] at mapToPair at CramSink.java:89), which has no missing parents
20:14:30.146 INFO MemoryStore - Block broadcast_509 stored as values in memory (estimated size 153.3 KiB, free 1917.1 MiB)
20:14:30.146 INFO MemoryStore - Block broadcast_509_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1917.1 MiB)
20:14:30.146 INFO BlockManagerInfo - Added broadcast_509_piece0 in memory on localhost:35739 (size: 58.1 KiB, free: 1919.4 MiB)
20:14:30.147 INFO SparkContext - Created broadcast 509 from broadcast at DAGScheduler.scala:1580
20:14:30.147 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 251 (MapPartitionsRDD[1208] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
20:14:30.147 INFO TaskSchedulerImpl - Adding task set 251.0 with 1 tasks resource profile 0
20:14:30.147 INFO TaskSetManager - Starting task 0.0 in stage 251.0 (TID 307) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:30.148 INFO Executor - Running task 0.0 in stage 251.0 (TID 307)
20:14:30.151 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:30.151 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:30.157 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:30.157 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:30.157 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:30.157 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:30.158 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:30.158 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:30.158 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/_temporary/attempt_20250210201430194937815244516838_1208_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:30.182 INFO StateChange - BLOCK* allocate blk_1073741912_1088, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/_temporary/attempt_20250210201430194937815244516838_1208_r_000000_0/part-r-00000
20:14:30.183 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741912_1088 src: /127.0.0.1:45366 dest: /127.0.0.1:38353
20:14:30.184 INFO clienttrace - src: /127.0.0.1:45366, dest: /127.0.0.1:38353, bytes: 42661, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741912_1088, duration(ns): 511566
20:14:30.184 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741912_1088, type=LAST_IN_PIPELINE terminating
20:14:30.185 INFO FSNamesystem - BLOCK* blk_1073741912_1088 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/_temporary/attempt_20250210201430194937815244516838_1208_r_000000_0/part-r-00000
20:14:30.586 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/_temporary/attempt_20250210201430194937815244516838_1208_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:30.587 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/_temporary/attempt_20250210201430194937815244516838_1208_r_000000_0 dst=null perm=null proto=rpc
20:14:30.587 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/_temporary/attempt_20250210201430194937815244516838_1208_r_000000_0 dst=null perm=null proto=rpc
20:14:30.587 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/task_20250210201430194937815244516838_1208_r_000000 dst=null perm=null proto=rpc
20:14:30.588 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/_temporary/attempt_20250210201430194937815244516838_1208_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/task_20250210201430194937815244516838_1208_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:30.588 INFO FileOutputCommitter - Saved output of task 'attempt_20250210201430194937815244516838_1208_r_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/task_20250210201430194937815244516838_1208_r_000000
20:14:30.588 INFO SparkHadoopMapRedUtil - attempt_20250210201430194937815244516838_1208_r_000000_0: Committed. Elapsed time: 1 ms.
20:14:30.589 INFO Executor - Finished task 0.0 in stage 251.0 (TID 307). 1858 bytes result sent to driver
20:14:30.589 INFO TaskSetManager - Finished task 0.0 in stage 251.0 (TID 307) in 442 ms on localhost (executor driver) (1/1)
20:14:30.589 INFO TaskSchedulerImpl - Removed TaskSet 251.0, whose tasks have all completed, from pool
20:14:30.589 INFO DAGScheduler - ResultStage 251 (runJob at SparkHadoopWriter.scala:83) finished in 0.449 s
20:14:30.589 INFO DAGScheduler - Job 189 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:30.589 INFO TaskSchedulerImpl - Killing all running tasks in stage 251: Stage finished
20:14:30.589 INFO DAGScheduler - Job 189 finished: runJob at SparkHadoopWriter.scala:83, took 0.510690 s
20:14:30.590 INFO SparkHadoopWriter - Start to commit write Job job_20250210201430194937815244516838_1208.
20:14:30.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:30.590 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts dst=null perm=null proto=rpc
20:14:30.591 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/task_20250210201430194937815244516838_1208_r_000000 dst=null perm=null proto=rpc
20:14:30.591 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/part-r-00000 dst=null perm=null proto=rpc
20:14:30.592 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary/0/task_20250210201430194937815244516838_1208_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:30.592 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_temporary dst=null perm=null proto=rpc
20:14:30.593 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:30.593 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:30.594 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/.spark-staging-1208 dst=null perm=null proto=rpc
20:14:30.594 INFO SparkHadoopWriter - Write Job job_20250210201430194937815244516838_1208 committed. Elapsed time: 4 ms.
20:14:30.594 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:30.599 INFO BlockManagerInfo - Removed broadcast_501_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.4 MiB)
20:14:30.599 INFO BlockManagerInfo - Removed broadcast_505_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:30.599 INFO StateChange - BLOCK* allocate blk_1073741913_1089, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/header
20:14:30.600 INFO BlockManagerInfo - Removed broadcast_508_piece0 on localhost:35739 in memory (size: 107.3 KiB, free: 1919.6 MiB)
20:14:30.600 INFO BlockManagerInfo - Removed broadcast_498_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:30.600 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741913_1089 src: /127.0.0.1:45380 dest: /127.0.0.1:38353
20:14:30.601 INFO BlockManagerInfo - Removed broadcast_491_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:30.601 INFO BlockManagerInfo - Removed broadcast_500_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.8 MiB)
20:14:30.602 INFO BlockManagerInfo - Removed broadcast_509_piece0 on localhost:35739 in memory (size: 58.1 KiB, free: 1919.9 MiB)
20:14:30.602 INFO clienttrace - src: /127.0.0.1:45380, dest: /127.0.0.1:38353, bytes: 1016, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741913_1089, duration(ns): 656534
20:14:30.602 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741913_1089, type=LAST_IN_PIPELINE terminating
20:14:30.602 INFO BlockManagerInfo - Removed broadcast_504_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.9 MiB)
20:14:30.603 INFO FSNamesystem - BLOCK* blk_1073741913_1089 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/header
20:14:30.603 INFO BlockManagerInfo - Removed broadcast_499_piece0 on localhost:35739 in memory (size: 54.6 KiB, free: 1919.9 MiB)
20:14:30.603 INFO BlockManagerInfo - Removed broadcast_497_piece0 on localhost:35739 in memory (size: 231.0 B, free: 1919.9 MiB)
20:14:31.003 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:31.004 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:31.005 INFO StateChange - BLOCK* allocate blk_1073741914_1090, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/terminator
20:14:31.005 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741914_1090 src: /127.0.0.1:45390 dest: /127.0.0.1:38353
20:14:31.006 INFO clienttrace - src: /127.0.0.1:45390, dest: /127.0.0.1:38353, bytes: 38, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741914_1090, duration(ns): 351668
20:14:31.006 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741914_1090, type=LAST_IN_PIPELINE terminating
20:14:31.007 INFO FSNamesystem - BLOCK* blk_1073741914_1090 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/terminator
20:14:31.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741906_1082 replica FinalizedReplica, blk_1073741906_1082, FINALIZED
getNumBytes() = 204
getBytesOnDisk() = 204
getVisibleLength()= 204
getVolume() = /tmp/minicluster_storage10361427482595794971/data/data2
getBlockURI() = file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741906 for deletion
20:14:31.049 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741907_1083 replica FinalizedReplica, blk_1073741907_1083, FINALIZED
  getNumBytes()     = 592
  getBytesOnDisk()  = 592
  getVisibleLength()= 592
  getVolume()       = /tmp/minicluster_storage10361427482595794971/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741907 for deletion
20:14:31.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741906_1082 URI file:/tmp/minicluster_storage10361427482595794971/data/data2/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741906
20:14:31.050 INFO FsDatasetAsyncDiskService - Deleted BP-488470852-10.1.0.79-1739218421831 blk_1073741907_1083 URI file:/tmp/minicluster_storage10361427482595794971/data/data1/current/BP-488470852-10.1.0.79-1739218421831/current/finalized/subdir0/subdir0/blk_1073741907
20:14:31.408 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/terminator is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:31.408 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts dst=null perm=null proto=rpc
20:14:31.409 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:31.410 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:31.410 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram
20:14:31.410 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/header, /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:31.410 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.411 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.411 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts/output dst=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:31.411 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram done
20:14:31.412 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.parts dst=null perm=null proto=rpc
20:14:31.412 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.412 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.413 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.413 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.414 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.crai dst=null perm=null proto=rpc
20:14:31.414 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.crai dst=null perm=null proto=rpc
20:14:31.416 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
20:14:31.416 WARN DFSUtil - Unexpected value for data transfer bytes=42997 duration=0
20:14:31.416 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
20:14:31.417 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.417 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.418 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.crai dst=null perm=null proto=rpc
20:14:31.418 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.crai dst=null perm=null proto=rpc
20:14:31.418 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.419 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.419 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
20:14:31.420 WARN DFSUtil - Unexpected value for data transfer bytes=42997 duration=0
20:14:31.420 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
20:14:31.421 INFO MemoryStore - Block broadcast_510 stored as values in memory (estimated size 528.0 B, free 1919.6 MiB)
20:14:31.421 INFO MemoryStore - Block broadcast_510_piece0 stored as bytes in memory (estimated size 187.0 B, free 1919.6 MiB)
20:14:31.421 INFO BlockManagerInfo - Added broadcast_510_piece0 in memory on localhost:35739 (size: 187.0 B, free: 1919.9 MiB)
20:14:31.421 INFO SparkContext - Created broadcast 510 from broadcast at CramSource.java:114
20:14:31.423 INFO MemoryStore - Block broadcast_511 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
20:14:31.432 INFO MemoryStore - Block broadcast_511_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
20:14:31.432 INFO BlockManagerInfo - Added broadcast_511_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.9 MiB)
20:14:31.432 INFO SparkContext - Created broadcast 511 from newAPIHadoopFile at PathSplitSource.java:96
20:14:31.456 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.456 INFO FileInputFormat - Total input files to process : 1
20:14:31.457 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.488 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:31.488 INFO DAGScheduler - Got job 190 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:31.488 INFO DAGScheduler - Final stage: ResultStage 252 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:31.488 INFO DAGScheduler - Parents of final stage: List()
20:14:31.488 INFO DAGScheduler - Missing parents: List()
20:14:31.489 INFO DAGScheduler - Submitting ResultStage 252 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:31.500 INFO MemoryStore - Block broadcast_512 stored as values in memory (estimated size 286.8 KiB, free 1919.0 MiB)
20:14:31.501 INFO MemoryStore - Block broadcast_512_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.9 MiB)
20:14:31.501 INFO BlockManagerInfo - Added broadcast_512_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.8 MiB)
20:14:31.501 INFO SparkContext - Created broadcast 512 from broadcast at DAGScheduler.scala:1580
20:14:31.501 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 252 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:31.501 INFO TaskSchedulerImpl - Adding task set 252.0 with 1 tasks resource profile 0
20:14:31.502 INFO TaskSetManager - Starting task 0.0 in stage 252.0 (TID 308) (localhost, executor driver, partition 0, ANY, 7853 bytes)
20:14:31.502 INFO Executor - Running task 0.0 in stage 252.0 (TID 308)
20:14:31.522 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram:0+43715
20:14:31.523 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.523 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.524 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.crai dst=null perm=null proto=rpc
20:14:31.525 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.crai dst=null perm=null proto=rpc
20:14:31.527 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
20:14:31.527 WARN DFSUtil - Unexpected value for data transfer bytes=42997 duration=0
20:14:31.527 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
20:14:31.542 INFO Executor - Finished task 0.0 in stage 252.0 (TID 308). 154058 bytes result sent to driver
20:14:31.543 INFO TaskSetManager - Finished task 0.0 in stage 252.0 (TID 308) in 41 ms on localhost (executor driver) (1/1)
20:14:31.543 INFO TaskSchedulerImpl - Removed TaskSet 252.0, whose tasks have all completed, from pool
20:14:31.543 INFO DAGScheduler - ResultStage 252 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.054 s
20:14:31.543 INFO DAGScheduler - Job 190 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:31.543 INFO TaskSchedulerImpl - Killing all running tasks in stage 252: Stage finished
20:14:31.543 INFO DAGScheduler - Job 190 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.055276 s
20:14:31.548 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:31.549 INFO DAGScheduler - Got job 191 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:31.549 INFO DAGScheduler - Final stage: ResultStage 253 (count at ReadsSparkSinkUnitTest.java:185)
20:14:31.549 INFO DAGScheduler - Parents of final stage: List()
20:14:31.549 INFO DAGScheduler - Missing parents: List()
20:14:31.549 INFO DAGScheduler - Submitting ResultStage 253 (MapPartitionsRDD[1197] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:31.560 INFO MemoryStore - Block broadcast_513 stored as values in memory (estimated size 286.8 KiB, free 1918.6 MiB)
20:14:31.561 INFO MemoryStore - Block broadcast_513_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.5 MiB)
20:14:31.561 INFO BlockManagerInfo - Added broadcast_513_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.7 MiB)
20:14:31.561 INFO SparkContext - Created broadcast 513 from broadcast at DAGScheduler.scala:1580
20:14:31.562 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 253 (MapPartitionsRDD[1197] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:31.562 INFO TaskSchedulerImpl - Adding task set 253.0 with 1 tasks resource profile 0
20:14:31.562 INFO TaskSetManager - Starting task 0.0 in stage 253.0 (TID 309) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
20:14:31.562 INFO Executor - Running task 0.0 in stage 253.0 (TID 309)
20:14:31.588 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
20:14:31.593 INFO Executor - Finished task 0.0 in stage 253.0 (TID 309). 989 bytes result sent to driver
20:14:31.594 INFO TaskSetManager - Finished task 0.0 in stage 253.0 (TID 309) in 32 ms on localhost (executor driver) (1/1)
20:14:31.594 INFO TaskSchedulerImpl - Removed TaskSet 253.0, whose tasks have all completed, from pool
20:14:31.594 INFO DAGScheduler - ResultStage 253 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.045 s
20:14:31.594 INFO DAGScheduler - Job 191 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:31.594 INFO TaskSchedulerImpl - Killing all running tasks in stage 253: Stage finished
20:14:31.594 INFO DAGScheduler - Job 191 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.045508 s
20:14:31.597 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:31.597 INFO DAGScheduler - Got job 192 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:31.597 INFO DAGScheduler - Final stage: ResultStage 254 (count at ReadsSparkSinkUnitTest.java:185)
20:14:31.597 INFO DAGScheduler - Parents of final stage: List()
20:14:31.597 INFO DAGScheduler - Missing parents: List()
20:14:31.598 INFO DAGScheduler - Submitting ResultStage 254 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:31.613 INFO MemoryStore - Block broadcast_514 stored as values in memory (estimated size 286.8 KiB, free 1918.3 MiB)
20:14:31.614 INFO MemoryStore - Block broadcast_514_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.2 MiB)
20:14:31.614 INFO BlockManagerInfo - Added broadcast_514_piece0 in memory on localhost:35739 (size: 103.6 KiB, free: 1919.6 MiB)
20:14:31.614 INFO SparkContext - Created broadcast 514 from broadcast at DAGScheduler.scala:1580
20:14:31.615 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 254 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:31.615 INFO TaskSchedulerImpl - Adding task set 254.0 with 1 tasks resource profile 0
20:14:31.615 INFO TaskSetManager - Starting task 0.0 in stage 254.0 (TID 310) (localhost, executor driver, partition 0, ANY, 7853 bytes)
20:14:31.615 INFO Executor - Running task 0.0 in stage 254.0 (TID 310)
20:14:31.635 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram:0+43715
20:14:31.636 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.636 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram dst=null perm=null proto=rpc
20:14:31.637 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.cram.crai dst=null perm=null proto=rpc
20:14:31.637 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_88115ced-8fa6-4fe8-b949-449cd9a5dc05.crai dst=null perm=null proto=rpc
20:14:31.639 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
20:14:31.639 WARN DFSUtil - Unexpected value for data transfer bytes=42997 duration=0
20:14:31.640 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
20:14:31.649 INFO Executor - Finished task 0.0 in stage 254.0 (TID 310). 989 bytes result sent to driver
20:14:31.649 INFO TaskSetManager - Finished task 0.0 in stage 254.0 (TID 310) in 34 ms on localhost (executor driver) (1/1)
20:14:31.649 INFO TaskSchedulerImpl - Removed TaskSet 254.0, whose tasks have all completed, from pool
20:14:31.649 INFO DAGScheduler - ResultStage 254 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.051 s
20:14:31.649 INFO DAGScheduler - Job 192 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:31.649 INFO TaskSchedulerImpl - Killing all running tasks in stage 254: Stage finished
20:14:31.650 INFO DAGScheduler - Job 192 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.052388 s
20:14:31.662 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:31.663 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:31.664 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:31.664 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:31.666 INFO MemoryStore - Block broadcast_515 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
20:14:31.673 INFO MemoryStore - Block broadcast_515_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.8 MiB)
20:14:31.673 INFO BlockManagerInfo - Added broadcast_515_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:31.673 INFO SparkContext - Created broadcast 515 from newAPIHadoopFile at PathSplitSource.java:96
20:14:31.694 INFO MemoryStore - Block broadcast_516 stored as values in memory (estimated size 297.9 KiB, free 1917.5 MiB)
20:14:31.701 INFO MemoryStore - Block broadcast_516_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
20:14:31.701 INFO BlockManagerInfo - Added broadcast_516_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.5 MiB)
20:14:31.701 INFO SparkContext - Created broadcast 516 from newAPIHadoopFile at PathSplitSource.java:96
20:14:31.721 INFO FileInputFormat - Total input files to process : 1
20:14:31.722 INFO MemoryStore - Block broadcast_517 stored as values in memory (estimated size 160.7 KiB, free 1917.3 MiB)
20:14:31.723 INFO MemoryStore - Block broadcast_517_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
20:14:31.723 INFO BlockManagerInfo - Added broadcast_517_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.5 MiB)
20:14:31.723 INFO SparkContext - Created broadcast 517 from broadcast at ReadsSparkSink.java:133
20:14:31.727 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts dst=null perm=null proto=rpc
20:14:31.727 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
20:14:31.727 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:31.727 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:31.728 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:31.734 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:31.734 INFO DAGScheduler - Registering RDD 1228 (mapToPair at SparkUtils.java:161) as input to shuffle 51
20:14:31.735 INFO DAGScheduler - Got job 193 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:31.735 INFO DAGScheduler - Final stage: ResultStage 256 (runJob at SparkHadoopWriter.scala:83)
20:14:31.735 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 255)
20:14:31.735 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 255)
20:14:31.735 INFO DAGScheduler - Submitting ShuffleMapStage 255 (MapPartitionsRDD[1228] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:31.753 INFO MemoryStore - Block broadcast_518 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
20:14:31.754 INFO MemoryStore - Block broadcast_518_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.6 MiB)
20:14:31.755 INFO BlockManagerInfo - Added broadcast_518_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.3 MiB)
20:14:31.755 INFO SparkContext - Created broadcast 518 from broadcast at DAGScheduler.scala:1580
20:14:31.755 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 255 (MapPartitionsRDD[1228] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:31.755 INFO TaskSchedulerImpl - Adding task set 255.0 with 1 tasks resource profile 0
20:14:31.755 INFO TaskSetManager - Starting task 0.0 in stage 255.0 (TID 311) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:31.756 INFO Executor - Running task 0.0 in stage 255.0 (TID 311)
20:14:31.785 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:31.799 INFO Executor - Finished task 0.0 in stage 255.0 (TID 311). 1148 bytes result sent to driver
20:14:31.799 INFO TaskSetManager - Finished task 0.0 in stage 255.0 (TID 311) in 44 ms on localhost (executor driver) (1/1)
20:14:31.799 INFO TaskSchedulerImpl - Removed TaskSet 255.0, whose tasks have all completed, from pool
20:14:31.800 INFO DAGScheduler - ShuffleMapStage 255 (mapToPair at SparkUtils.java:161) finished in 0.064 s
20:14:31.800 INFO DAGScheduler - looking for newly runnable stages
20:14:31.800 INFO DAGScheduler - running: HashSet()
20:14:31.800 INFO DAGScheduler - waiting: HashSet(ResultStage 256)
20:14:31.800 INFO DAGScheduler - failed: HashSet()
20:14:31.800 INFO DAGScheduler - Submitting ResultStage 256 (MapPartitionsRDD[1234] at saveAsTextFile at SamSink.java:65), which has no missing parents
20:14:31.806 INFO MemoryStore - Block broadcast_519 stored as values in memory (estimated size 241.1 KiB, free 1916.4 MiB)
20:14:31.807 INFO MemoryStore - Block broadcast_519_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.3 MiB)
20:14:31.807 INFO BlockManagerInfo - Added broadcast_519_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.3 MiB)
20:14:31.807 INFO SparkContext - Created broadcast 519 from broadcast at DAGScheduler.scala:1580
20:14:31.808 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 256 (MapPartitionsRDD[1234] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
20:14:31.808 INFO TaskSchedulerImpl - Adding task set 256.0 with 1 tasks resource profile 0
20:14:31.808 INFO TaskSetManager - Starting task 0.0 in stage 256.0 (TID 312) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:31.808 INFO Executor - Running task 0.0 in stage 256.0 (TID 312)
20:14:31.812 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:31.812 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:31.823 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
20:14:31.823 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:31.823 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:31.824 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/_temporary/attempt_20250210201431945685293742120033_1234_m_000000_0/part-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:31.826 INFO StateChange - BLOCK* allocate blk_1073741915_1091, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/_temporary/attempt_20250210201431945685293742120033_1234_m_000000_0/part-00000
20:14:31.827 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741915_1091 src: /127.0.0.1:45392 dest: /127.0.0.1:38353
20:14:31.834 INFO clienttrace - src: /127.0.0.1:45392, dest: /127.0.0.1:38353, bytes: 761729, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741915_1091, duration(ns): 6387670
20:14:31.834 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741915_1091, type=LAST_IN_PIPELINE terminating
20:14:31.835 INFO FSNamesystem - BLOCK* blk_1073741915_1091 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/_temporary/attempt_20250210201431945685293742120033_1234_m_000000_0/part-00000
20:14:32.235 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/_temporary/attempt_20250210201431945685293742120033_1234_m_000000_0/part-00000 is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:32.236 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/_temporary/attempt_20250210201431945685293742120033_1234_m_000000_0 dst=null perm=null proto=rpc
20:14:32.237 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/_temporary/attempt_20250210201431945685293742120033_1234_m_000000_0 dst=null perm=null proto=rpc
20:14:32.237 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/task_20250210201431945685293742120033_1234_m_000000 dst=null perm=null proto=rpc
20:14:32.237 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/_temporary/attempt_20250210201431945685293742120033_1234_m_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/task_20250210201431945685293742120033_1234_m_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
20:14:32.238 INFO FileOutputCommitter - Saved output of task 'attempt_20250210201431945685293742120033_1234_m_000000_0' to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/task_20250210201431945685293742120033_1234_m_000000
20:14:32.238 INFO SparkHadoopMapRedUtil - attempt_20250210201431945685293742120033_1234_m_000000_0: Committed. Elapsed time: 1 ms.
20:14:32.238 INFO Executor - Finished task 0.0 in stage 256.0 (TID 312). 1858 bytes result sent to driver
20:14:32.238 INFO TaskSetManager - Finished task 0.0 in stage 256.0 (TID 312) in 430 ms on localhost (executor driver) (1/1)
20:14:32.238 INFO TaskSchedulerImpl - Removed TaskSet 256.0, whose tasks have all completed, from pool
20:14:32.239 INFO DAGScheduler - ResultStage 256 (runJob at SparkHadoopWriter.scala:83) finished in 0.439 s
20:14:32.239 INFO DAGScheduler - Job 193 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:32.239 INFO TaskSchedulerImpl - Killing all running tasks in stage 256: Stage finished
20:14:32.239 INFO DAGScheduler - Job 193 finished: runJob at SparkHadoopWriter.scala:83, took 0.504791 s
20:14:32.239 INFO SparkHadoopWriter - Start to commit write Job job_20250210201431945685293742120033_1234.
20:14:32.240 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0 dst=null perm=null proto=rpc
20:14:32.240 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts dst=null perm=null proto=rpc
20:14:32.241 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/task_20250210201431945685293742120033_1234_m_000000 dst=null perm=null proto=rpc
20:14:32.241 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/part-00000 dst=null perm=null proto=rpc
20:14:32.241 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary/0/task_20250210201431945685293742120033_1234_m_000000/part-00000 dst=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/part-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:32.242 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_temporary dst=null perm=null proto=rpc
20:14:32.243 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:32.243 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:32.244 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/.spark-staging-1234 dst=null perm=null proto=rpc
20:14:32.244 INFO SparkHadoopWriter - Write Job job_20250210201431945685293742120033_1234 committed. Elapsed time: 4 ms.
20:14:32.244 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:32.245 INFO StateChange - BLOCK* allocate blk_1073741916_1092, replicas=127.0.0.1:38353 for /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/header
20:14:32.246 INFO DataNode - Receiving BP-488470852-10.1.0.79-1739218421831:blk_1073741916_1092 src: /127.0.0.1:45402 dest: /127.0.0.1:38353
20:14:32.247 INFO clienttrace - src: /127.0.0.1:45402, dest: /127.0.0.1:38353, bytes: 85829, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-679411174_1, offset: 0, srvID: e05b3ae3-c8c8-405b-9911-7c0919b02d43, blockid: BP-488470852-10.1.0.79-1739218421831:blk_1073741916_1092, duration(ns): 534055
20:14:32.247 INFO DataNode - PacketResponder: BP-488470852-10.1.0.79-1739218421831:blk_1073741916_1092, type=LAST_IN_PIPELINE terminating
20:14:32.248 INFO FSNamesystem - BLOCK* blk_1073741916_1092 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/header
20:14:32.648 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/header is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:32.649 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts dst=null perm=null proto=rpc
20:14:32.650 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:32.650 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/output is closed by DFSClient_NONMAPREDUCE_-679411174_1
20:14:32.650 INFO HadoopFileSystemWrapper - Concatenating 2 parts to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam
20:14:32.651 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/header, /user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/part-00000] dst=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:32.651 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.651 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.652 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam perm=runner:supergroup:rw-r--r-- proto=rpc
20:14:32.652 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam done
20:14:32.652 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam.parts dst=null perm=null proto=rpc
20:14:32.653 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.653 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.653 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.654 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
WARNING 2025-02-10 20:14:32 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:32.655 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
20:14:32.657 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.657 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.657 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
WARNING 2025-02-10 20:14:32 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:14:32.658 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
20:14:32.660 INFO MemoryStore - Block broadcast_520 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
20:14:32.661 INFO MemoryStore - Block broadcast_520_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
20:14:32.661 INFO BlockManagerInfo - Added broadcast_520_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.3 MiB)
20:14:32.661 INFO SparkContext - Created broadcast 520 from broadcast at SamSource.java:78
20:14:32.662 INFO MemoryStore - Block broadcast_521 stored as values in memory (estimated size 297.9 KiB, free 1915.9 MiB)
20:14:32.668 INFO MemoryStore - Block broadcast_521_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.8 MiB)
20:14:32.668 INFO BlockManagerInfo - Added broadcast_521_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.2 MiB)
20:14:32.668 INFO SparkContext - Created broadcast 521 from newAPIHadoopFile at SamSource.java:108
20:14:32.671 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.671 INFO FileInputFormat - Total input files to process : 1
20:14:32.671 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.675 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:32.675 INFO DAGScheduler - Got job 194 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:32.675 INFO DAGScheduler - Final stage: ResultStage 257 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:32.675 INFO DAGScheduler - Parents of final stage: List()
20:14:32.675 INFO DAGScheduler - Missing parents: List()
20:14:32.675 INFO DAGScheduler - Submitting ResultStage 257 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:32.676 INFO MemoryStore - Block broadcast_522 stored as values in memory (estimated size 7.5 KiB, free 1915.8 MiB)
20:14:32.680 INFO MemoryStore - Block broadcast_522_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1915.8 MiB)
20:14:32.680 INFO BlockManagerInfo - Added broadcast_522_piece0 in memory on localhost:35739 (size: 3.8 KiB, free: 1919.2 MiB)
20:14:32.680 INFO BlockManagerInfo - Removed broadcast_512_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.3 MiB)
20:14:32.680 INFO SparkContext - Created broadcast 522 from broadcast at DAGScheduler.scala:1580
20:14:32.681 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 257 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:32.681 INFO TaskSchedulerImpl - Adding task set 257.0 with 1 tasks resource profile 0
20:14:32.681 INFO BlockManagerInfo - Removed broadcast_510_piece0 on localhost:35739 in memory (size: 187.0 B, free: 1919.3 MiB)
20:14:32.681 INFO TaskSetManager - Starting task 0.0 in stage 257.0 (TID 313) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:32.681 INFO BlockManagerInfo - Removed broadcast_506_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.3 MiB)
20:14:32.681 INFO Executor - Running task 0.0 in stage 257.0 (TID 313)
20:14:32.682 INFO BlockManagerInfo - Removed broadcast_519_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.4 MiB)
20:14:32.682 INFO BlockManagerInfo - Removed broadcast_518_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.5 MiB)
20:14:32.683 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam:0+847558
20:14:32.684 INFO BlockManagerInfo - Removed broadcast_514_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.6 MiB)
20:14:32.685 INFO BlockManagerInfo - Removed broadcast_516_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.7 MiB)
20:14:32.685 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.685 INFO BlockManagerInfo - Removed broadcast_517_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.7 MiB)
20:14:32.686 INFO BlockManagerInfo - Removed broadcast_513_piece0 on localhost:35739 in memory (size: 103.6 KiB, free: 1919.8 MiB)
20:14:32.686 INFO BlockManagerInfo - Removed broadcast_503_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.8 MiB)
20:14:32.687 INFO BlockManagerInfo - Removed broadcast_507_piece0 on localhost:35739 in memory (size: 1473.0 B, free: 1919.8 MiB)
20:14:32.687 INFO BlockManagerInfo - Removed broadcast_502_piece0 on localhost:35739 in memory (size: 228.0 B, free: 1919.8 MiB)
20:14:32.688 INFO BlockManagerInfo - Removed broadcast_511_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.9 MiB)
20:14:32.703 INFO Executor - Finished task 0.0 in stage 257.0 (TID 313). 651526 bytes result sent to driver
20:14:32.705 INFO TaskSetManager - Finished task 0.0 in stage 257.0 (TID 313) in 24 ms on localhost (executor driver) (1/1)
20:14:32.705 INFO TaskSchedulerImpl - Removed TaskSet 257.0, whose tasks have all completed, from pool
20:14:32.705 INFO DAGScheduler - ResultStage 257 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
20:14:32.705 INFO DAGScheduler - Job 194 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:32.705 INFO TaskSchedulerImpl - Killing all running tasks in stage 257: Stage finished
20:14:32.705 INFO DAGScheduler - Job 194 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029939 s
20:14:32.720 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:32.721 INFO DAGScheduler - Got job 195 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:32.721 INFO DAGScheduler - Final stage: ResultStage 258 (count at ReadsSparkSinkUnitTest.java:185)
20:14:32.721 INFO DAGScheduler - Parents of final stage: List()
20:14:32.721 INFO DAGScheduler - Missing parents: List()
20:14:32.721 INFO DAGScheduler - Submitting ResultStage 258 (MapPartitionsRDD[1221] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:32.737 INFO MemoryStore - Block broadcast_523 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
20:14:32.739 INFO MemoryStore - Block broadcast_523_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
20:14:32.739 INFO BlockManagerInfo - Added broadcast_523_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.7 MiB)
20:14:32.739 INFO SparkContext - Created broadcast 523 from broadcast at DAGScheduler.scala:1580
20:14:32.739 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 258 (MapPartitionsRDD[1221] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:32.739 INFO TaskSchedulerImpl - Adding task set 258.0 with 1 tasks resource profile 0
20:14:32.739 INFO TaskSetManager - Starting task 0.0 in stage 258.0 (TID 314) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:32.740 INFO Executor - Running task 0.0 in stage 258.0 (TID 314)
20:14:32.769 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:32.779 INFO Executor - Finished task 0.0 in stage 258.0 (TID 314). 989 bytes result sent to driver
20:14:32.779 INFO TaskSetManager - Finished task 0.0 in stage 258.0 (TID 314) in 40 ms on localhost (executor driver) (1/1)
20:14:32.779 INFO TaskSchedulerImpl - Removed TaskSet 258.0, whose tasks have all completed, from pool
20:14:32.779 INFO DAGScheduler - ResultStage 258 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
20:14:32.779 INFO DAGScheduler - Job 195 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:32.779 INFO TaskSchedulerImpl - Killing all running tasks in stage 258: Stage finished
20:14:32.779 INFO DAGScheduler - Job 195 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.058924 s
20:14:32.783 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:32.783 INFO DAGScheduler - Got job 196 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:32.783 INFO DAGScheduler - Final stage: ResultStage 259 (count at ReadsSparkSinkUnitTest.java:185)
20:14:32.783 INFO DAGScheduler - Parents of final stage: List()
20:14:32.783 INFO DAGScheduler - Missing parents: List()
20:14:32.783 INFO DAGScheduler - Submitting ResultStage 259 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:32.784 INFO MemoryStore - Block broadcast_524 stored as values in memory (estimated size 7.4 KiB, free 1918.6 MiB)
20:14:32.784 INFO MemoryStore - Block broadcast_524_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.6 MiB)
20:14:32.784 INFO BlockManagerInfo - Added broadcast_524_piece0 in memory on localhost:35739 (size: 3.8 KiB, free: 1919.7 MiB)
20:14:32.784 INFO SparkContext - Created broadcast 524 from broadcast at DAGScheduler.scala:1580
20:14:32.784 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 259 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:32.784 INFO TaskSchedulerImpl - Adding task set 259.0 with 1 tasks resource profile 0
20:14:32.785 INFO TaskSetManager - Starting task 0.0 in stage 259.0 (TID 315) (localhost, executor driver, partition 0, ANY, 7852 bytes)
20:14:32.785 INFO Executor - Running task 0.0 in stage 259.0 (TID 315)
20:14:32.786 INFO NewHadoopRDD - Input split: hdfs://localhost:40199/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam:0+847558
20:14:32.787 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e2ed3377-f994-4fd3-82b0-c1665197c39e.sam dst=null perm=null proto=rpc
20:14:32.788 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
20:14:32.794 INFO Executor - Finished task 0.0 in stage 259.0 (TID 315). 946 bytes result sent to driver
20:14:32.794 INFO TaskSetManager - Finished task 0.0 in stage 259.0 (TID 315) in 9 ms on localhost (executor driver) (1/1)
20:14:32.794 INFO TaskSchedulerImpl - Removed TaskSet 259.0, whose tasks have all completed, from pool
20:14:32.794 INFO DAGScheduler - ResultStage 259 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.011 s
20:14:32.794 INFO DAGScheduler - Job 196 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:32.794 INFO TaskSchedulerImpl - Killing all running tasks in stage 259: Stage finished
20:14:32.794 INFO DAGScheduler - Job 196 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.011598 s
20:14:32.797 INFO MemoryStore - Block broadcast_525 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
20:14:32.803 INFO MemoryStore - Block broadcast_525_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
20:14:32.803 INFO BlockManagerInfo - Added broadcast_525_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.7 MiB)
20:14:32.803 INFO SparkContext - Created broadcast 525 from newAPIHadoopFile at PathSplitSource.java:96
20:14:32.825 INFO MemoryStore - Block broadcast_526 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
20:14:32.831 INFO MemoryStore - Block broadcast_526_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
20:14:32.831 INFO BlockManagerInfo - Added broadcast_526_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.6 MiB)
20:14:32.831 INFO SparkContext - Created broadcast 526 from newAPIHadoopFile at PathSplitSource.java:96
20:14:32.851 INFO FileInputFormat - Total input files to process : 1
20:14:32.852 INFO MemoryStore - Block broadcast_527 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
20:14:32.853 INFO MemoryStore - Block broadcast_527_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
20:14:32.853 INFO BlockManagerInfo - Added broadcast_527_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:32.853 INFO SparkContext - Created broadcast 527 from broadcast at ReadsSparkSink.java:133
20:14:32.855 INFO MemoryStore - Block broadcast_528 stored as values in memory (estimated size 163.2 KiB, free 1917.6 MiB)
20:14:32.855 INFO MemoryStore - Block broadcast_528_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.6 MiB)
20:14:32.855 INFO BlockManagerInfo - Added broadcast_528_piece0 in memory on localhost:35739 (size: 9.6 KiB, free: 1919.6 MiB)
20:14:32.855 INFO SparkContext - Created broadcast 528 from broadcast at BamSink.java:76
20:14:32.857 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:32.857 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:32.857 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:32.874 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
20:14:32.874 INFO DAGScheduler - Registering RDD 1253 (mapToPair at SparkUtils.java:161) as input to shuffle 52
20:14:32.874 INFO DAGScheduler - Got job 197 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
20:14:32.874 INFO DAGScheduler - Final stage: ResultStage 261 (runJob at SparkHadoopWriter.scala:83)
20:14:32.874 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 260)
20:14:32.874 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 260)
20:14:32.874 INFO DAGScheduler - Submitting ShuffleMapStage 260 (MapPartitionsRDD[1253] at mapToPair at SparkUtils.java:161), which has no missing parents
20:14:32.891 INFO MemoryStore - Block broadcast_529 stored as values in memory (estimated size 520.4 KiB, free 1917.0 MiB)
20:14:32.893 INFO MemoryStore - Block broadcast_529_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.9 MiB)
20:14:32.893 INFO BlockManagerInfo - Added broadcast_529_piece0 in memory on localhost:35739 (size: 166.1 KiB, free: 1919.5 MiB)
20:14:32.893 INFO SparkContext - Created broadcast 529 from broadcast at DAGScheduler.scala:1580
20:14:32.893 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 260 (MapPartitionsRDD[1253] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
20:14:32.893 INFO TaskSchedulerImpl - Adding task set 260.0 with 1 tasks resource profile 0
20:14:32.894 INFO TaskSetManager - Starting task 0.0 in stage 260.0 (TID 316) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
20:14:32.894 INFO Executor - Running task 0.0 in stage 260.0 (TID 316)
20:14:32.924 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:32.938 INFO Executor - Finished task 0.0 in stage 260.0 (TID 316). 1148 bytes result sent to driver
20:14:32.938 INFO TaskSetManager - Finished task 0.0 in stage 260.0 (TID 316) in 45 ms on localhost (executor driver) (1/1)
20:14:32.938 INFO TaskSchedulerImpl - Removed TaskSet 260.0, whose tasks have all completed, from pool
20:14:32.938 INFO DAGScheduler - ShuffleMapStage 260 (mapToPair at SparkUtils.java:161) finished in 0.063 s
20:14:32.939 INFO DAGScheduler - looking for newly runnable stages
20:14:32.939 INFO DAGScheduler - running: HashSet()
20:14:32.939 INFO DAGScheduler - waiting: HashSet(ResultStage 261)
20:14:32.939 INFO DAGScheduler - failed: HashSet()
20:14:32.939 INFO DAGScheduler - Submitting ResultStage 261 (MapPartitionsRDD[1258] at mapToPair at BamSink.java:91), which has no missing parents
20:14:32.945 INFO MemoryStore - Block broadcast_530 stored as values in memory (estimated size 241.4 KiB, free 1916.6 MiB)
20:14:32.946 INFO MemoryStore - Block broadcast_530_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.6 MiB)
20:14:32.946 INFO BlockManagerInfo - Added broadcast_530_piece0 in memory on localhost:35739 (size: 67.0 KiB, free: 1919.4 MiB)
20:14:32.946 INFO SparkContext - Created broadcast 530 from broadcast at DAGScheduler.scala:1580
20:14:32.946 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 261 (MapPartitionsRDD[1258] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
20:14:32.946 INFO TaskSchedulerImpl - Adding task set 261.0 with 1 tasks resource profile 0
20:14:32.947 INFO TaskSetManager - Starting task 0.0 in stage 261.0 (TID 317) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
20:14:32.947 INFO Executor - Running task 0.0 in stage 261.0 (TID 317)
20:14:32.951 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
20:14:32.951 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
20:14:32.962 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:32.962 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:32.962 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:32.962 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
20:14:32.962 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
20:14:32.962 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
20:14:32.985 INFO FileOutputCommitter - Saved output of task 'attempt_202502102014328358470122464060801_1258_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest12260819647547947603.bam.parts/_temporary/0/task_202502102014328358470122464060801_1258_r_000000
20:14:32.985 INFO SparkHadoopMapRedUtil - attempt_202502102014328358470122464060801_1258_r_000000_0: Committed. Elapsed time: 0 ms.
20:14:32.985 INFO Executor - Finished task 0.0 in stage 261.0 (TID 317). 1858 bytes result sent to driver
20:14:32.986 INFO TaskSetManager - Finished task 0.0 in stage 261.0 (TID 317) in 39 ms on localhost (executor driver) (1/1)
20:14:32.986 INFO TaskSchedulerImpl - Removed TaskSet 261.0, whose tasks have all completed, from pool
20:14:32.986 INFO DAGScheduler - ResultStage 261 (runJob at SparkHadoopWriter.scala:83) finished in 0.047 s
20:14:32.986 INFO DAGScheduler - Job 197 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:32.986 INFO TaskSchedulerImpl - Killing all running tasks in stage 261: Stage finished
20:14:32.986 INFO DAGScheduler - Job 197 finished: runJob at SparkHadoopWriter.scala:83, took 0.112351 s
20:14:32.986 INFO SparkHadoopWriter - Start to commit write Job job_202502102014328358470122464060801_1258.
20:14:32.991 INFO SparkHadoopWriter - Write Job job_202502102014328358470122464060801_1258 committed. Elapsed time: 4 ms.
20:14:33.002 INFO HadoopFileSystemWrapper - Concatenating 3 parts to file:////tmp/ReadsSparkSinkUnitTest12260819647547947603.bam
20:14:33.006 INFO HadoopFileSystemWrapper - Concatenating to file:////tmp/ReadsSparkSinkUnitTest12260819647547947603.bam done
20:14:33.006 INFO IndexFileMerger - Merging .sbi files in temp directory file:////tmp/ReadsSparkSinkUnitTest12260819647547947603.bam.parts/ to file:////tmp/ReadsSparkSinkUnitTest12260819647547947603.bam.sbi
20:14:33.011 INFO IndexFileMerger - Done merging .sbi files
20:14:33.011 INFO IndexFileMerger - Merging .bai files in temp directory file:////tmp/ReadsSparkSinkUnitTest12260819647547947603.bam.parts/ to file:////tmp/ReadsSparkSinkUnitTest12260819647547947603.bam.bai
20:14:33.016 INFO IndexFileMerger - Done merging .bai files
20:14:33.018 INFO MemoryStore - Block broadcast_531 stored as values in memory (estimated size 320.0 B, free 1916.6 MiB)
20:14:33.019 INFO MemoryStore - Block broadcast_531_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.6 MiB)
20:14:33.019 INFO BlockManagerInfo - Added broadcast_531_piece0 in memory on localhost:35739 (size: 233.0 B, free: 1919.4 MiB)
20:14:33.019 INFO SparkContext - Created broadcast 531 from broadcast at BamSource.java:104
20:14:33.020 INFO MemoryStore - Block broadcast_532 stored as values in memory (estimated size 297.9 KiB, free 1916.3 MiB)
20:14:33.026 INFO MemoryStore - Block broadcast_532_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.2 MiB)
20:14:33.026 INFO BlockManagerInfo - Added broadcast_532_piece0 in memory on localhost:35739 (size: 50.2 KiB, free: 1919.3 MiB)
20:14:33.026 INFO SparkContext - Created broadcast 532 from newAPIHadoopFile at PathSplitSource.java:96
20:14:33.035 INFO FileInputFormat - Total input files to process : 1
20:14:33.049 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
20:14:33.049 INFO DAGScheduler - Got job 198 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
20:14:33.049 INFO DAGScheduler - Final stage: ResultStage 262 (collect at ReadsSparkSinkUnitTest.java:182)
20:14:33.049 INFO DAGScheduler - Parents of final stage: List()
20:14:33.049 INFO DAGScheduler - Missing parents: List()
20:14:33.049 INFO DAGScheduler - Submitting ResultStage 262 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:33.055 INFO MemoryStore - Block broadcast_533 stored as values in memory (estimated size 148.2 KiB, free 1916.1 MiB)
20:14:33.056 INFO MemoryStore - Block broadcast_533_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1916.0 MiB)
20:14:33.056 INFO BlockManagerInfo - Added broadcast_533_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.3 MiB)
20:14:33.056 INFO SparkContext - Created broadcast 533 from broadcast at DAGScheduler.scala:1580
20:14:33.056 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 262 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:33.056 INFO TaskSchedulerImpl - Adding task set 262.0 with 1 tasks resource profile 0
20:14:33.056 INFO TaskSetManager - Starting task 0.0 in stage 262.0 (TID 318) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:33.057 INFO Executor - Running task 0.0 in stage 262.0 (TID 318)
20:14:33.072 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest12260819647547947603.bam:0+237038
20:14:33.076 INFO Executor - Finished task 0.0 in stage 262.0 (TID 318). 651483 bytes result sent to driver
20:14:33.077 INFO TaskSetManager - Finished task 0.0 in stage 262.0 (TID 318) in 21 ms on localhost (executor driver) (1/1)
20:14:33.077 INFO TaskSchedulerImpl - Removed TaskSet 262.0, whose tasks have all completed, from pool
20:14:33.077 INFO DAGScheduler - ResultStage 262 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.028 s
20:14:33.077 INFO DAGScheduler - Job 198 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:33.077 INFO TaskSchedulerImpl - Killing all running tasks in stage 262: Stage finished
20:14:33.077 INFO DAGScheduler - Job 198 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.028559 s
20:14:33.087 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:33.087 INFO DAGScheduler - Got job 199 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:33.087 INFO DAGScheduler - Final stage: ResultStage 263 (count at ReadsSparkSinkUnitTest.java:185)
20:14:33.087 INFO DAGScheduler - Parents of final stage: List()
20:14:33.087 INFO DAGScheduler - Missing parents: List()
20:14:33.087 INFO DAGScheduler - Submitting ResultStage 263 (MapPartitionsRDD[1246] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:33.104 INFO MemoryStore - Block broadcast_534 stored as values in memory (estimated size 426.1 KiB, free 1915.6 MiB)
20:14:33.109 INFO BlockManagerInfo - Removed broadcast_523_piece0 on localhost:35739 in memory (size: 153.6 KiB, free: 1919.4 MiB)
20:14:33.109 INFO BlockManagerInfo - Removed broadcast_524_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1919.4 MiB)
20:14:33.109 INFO BlockManagerInfo - Removed broadcast_526_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:33.110 INFO MemoryStore - Block broadcast_534_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.6 MiB)
20:14:33.110 INFO BlockManagerInfo - Removed broadcast_528_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:33.110 INFO BlockManagerInfo - Added broadcast_534_piece0 in memory on localhost:35739 (size: 153.6 KiB, free: 1919.4 MiB)
20:14:33.110 INFO SparkContext - Created broadcast 534 from broadcast at DAGScheduler.scala:1580
20:14:33.110 INFO BlockManagerInfo - Removed broadcast_533_piece0 on localhost:35739 in memory (size: 54.5 KiB, free: 1919.4 MiB)
20:14:33.110 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 263 (MapPartitionsRDD[1246] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:33.110 INFO TaskSchedulerImpl - Adding task set 263.0 with 1 tasks resource profile 0
20:14:33.110 INFO BlockManagerInfo - Removed broadcast_527_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.4 MiB)
20:14:33.111 INFO TaskSetManager - Starting task 0.0 in stage 263.0 (TID 319) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
20:14:33.111 INFO Executor - Running task 0.0 in stage 263.0 (TID 319)
20:14:33.111 INFO BlockManagerInfo - Removed broadcast_530_piece0 on localhost:35739 in memory (size: 67.0 KiB, free: 1919.5 MiB)
20:14:33.111 INFO BlockManagerInfo - Removed broadcast_520_piece0 on localhost:35739 in memory (size: 9.6 KiB, free: 1919.5 MiB)
20:14:33.112 INFO BlockManagerInfo - Removed broadcast_521_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.5 MiB)
20:14:33.112 INFO BlockManagerInfo - Removed broadcast_515_piece0 on localhost:35739 in memory (size: 50.2 KiB, free: 1919.6 MiB)
20:14:33.113 INFO BlockManagerInfo - Removed broadcast_522_piece0 on localhost:35739 in memory (size: 3.8 KiB, free: 1919.6 MiB)
20:14:33.113 INFO BlockManagerInfo - Removed broadcast_529_piece0 on localhost:35739 in memory (size: 166.1 KiB, free: 1919.8 MiB)
20:14:33.140 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
20:14:33.150 INFO Executor - Finished task 0.0 in stage 263.0 (TID 319). 989 bytes result sent to driver
20:14:33.150 INFO TaskSetManager - Finished task 0.0 in stage 263.0 (TID 319) in 39 ms on localhost (executor driver) (1/1)
20:14:33.150 INFO TaskSchedulerImpl - Removed TaskSet 263.0, whose tasks have all completed, from pool
20:14:33.150 INFO DAGScheduler - ResultStage 263 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
20:14:33.150 INFO DAGScheduler - Job 199 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:33.150 INFO TaskSchedulerImpl - Killing all running tasks in stage 263: Stage finished
20:14:33.150 INFO DAGScheduler - Job 199 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.063545 s
20:14:33.153 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
20:14:33.154 INFO DAGScheduler - Got job 200 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
20:14:33.154 INFO DAGScheduler - Final stage: ResultStage 264 (count at ReadsSparkSinkUnitTest.java:185)
20:14:33.154 INFO DAGScheduler - Parents of final stage: List()
20:14:33.154 INFO DAGScheduler - Missing parents: List()
20:14:33.154 INFO DAGScheduler - Submitting ResultStage 264 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96), which has no missing parents
20:14:33.160 INFO MemoryStore - Block broadcast_535 stored as values in memory (estimated size 148.1 KiB, free 1918.6 MiB)
20:14:33.160 INFO MemoryStore - Block broadcast_535_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.6 MiB)
20:14:33.161 INFO BlockManagerInfo - Added broadcast_535_piece0 in memory on localhost:35739 (size: 54.5 KiB, free: 1919.7 MiB)
20:14:33.161 INFO SparkContext - Created broadcast 535 from broadcast at DAGScheduler.scala:1580
20:14:33.161 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 264 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
20:14:33.161 INFO TaskSchedulerImpl - Adding task set 264.0 with 1 tasks resource profile 0
20:14:33.161 INFO TaskSetManager - Starting task 0.0 in stage 264.0 (TID 320) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
20:14:33.161 INFO Executor - Running task 0.0 in stage 264.0 (TID 320)
20:14:33.172 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest12260819647547947603.bam:0+237038
20:14:33.176 INFO Executor - Finished task 0.0 in stage 264.0 (TID 320). 989 bytes result sent to driver
20:14:33.176 INFO TaskSetManager - Finished task 0.0 in stage 264.0 (TID 320) in 15 ms on localhost (executor driver) (1/1)
20:14:33.176 INFO TaskSchedulerImpl - Removed TaskSet 264.0, whose tasks have all completed, from pool
20:14:33.176 INFO DAGScheduler - ResultStage 264 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.022 s
20:14:33.176 INFO DAGScheduler - Job 200 is finished. Cancelling potential speculative or zombie tasks for this job
20:14:33.176 INFO TaskSchedulerImpl - Killing all running tasks in stage 264: Stage finished
20:14:33.176 INFO DAGScheduler - Job 200 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022766 s